Thursday, January 29, 2015
By the end of Resolving In Place Alters, Part 3 I was able to read the raw data pages that belong to the table with pending IPAs, but I wasn't able to tell whether a data page had a pending IPA or not. In this installment I will show you how I solved that problem using what I know about the structure of an Informix data page and the structure of my table before and after the in place alter was performed.
Anatomy of the Informix Data Page
When Informix stores data on disk, it doesn't just write your row data. It also writes some "housekeeping" data at the beginning and end of the page that helps the engine know what is actually stored there.
The first 24 bytes of a page contain the page header, with all kinds of information that you're going to have a hard time finding documented. The last N bytes of a data page contain a timestamp and the slot table, which you may have better luck finding documentation on. In between the page header and the page footer (slot table + timestamp) you'll find your actual data. We will need information from all three parts of the data page.
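To make the three regions concrete, here is a minimal Python sketch of how a page could be split apart. The header field offsets (like the slot count at offset 8) and the 4-byte sizes are illustrative assumptions of mine, not the real (undocumented) header layout; the point is the shape of the page: header up front, slot table growing backward from just before the page-ending timestamp.

```python
import struct

PAGE_SIZE = 2048   # assumption: 2 KB pages
HDR_SIZE = 24      # the 24-byte page header described above
TS_SIZE = 4        # assumption: 4-byte page-ending timestamp
SLOT_SIZE = 4      # assumption: each slot entry is a 2-byte offset + 2-byte length

def parse_page(page):
    """Split one raw data page into (slot count, slot table, ending timestamp).

    The field offsets used here are placeholders for illustration, not
    the real (undocumented) header layout.
    """
    # assumption: the slot count is an unsigned short at header offset 8
    nslots = struct.unpack_from("<H", page, 8)[0]
    # the page-ending timestamp is the last TS_SIZE bytes of the page
    end_ts = struct.unpack_from("<I", page, PAGE_SIZE - TS_SIZE)[0]
    # the slot table sits just before the timestamp and grows backward:
    # slot 0 is closest to the timestamp, slot 1 just before it, and so on
    slots = []
    for i in range(nslots):
        off = PAGE_SIZE - TS_SIZE - (i + 1) * SLOT_SIZE
        slot_off, slot_len = struct.unpack_from("<HH", page, off)
        slots.append((slot_off, slot_len))
    return nslots, slots, end_ts
```

Given a page holding two rows, parse_page would hand back the two (offset, length) pairs that say where each row starts in the data area and how long it is.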
Friday, January 23, 2015
In Resolving In Place Alters - Part 2 I decided that I want to try to identify only the pages with a pending IPA and only "fix" those data pages in an attempt to speed up the resolving of IPAs and limit the work load added to the system when doing so.
To do this, I will need to read the actual data pages stored on disk and use information in the page header and slot table to determine if a page needs to be fixed. More on this in the next blog, today I just want to talk about how to read in the raw data pages.
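As a sketch of the idea (not my actual code), reading a raw page boils down to seeking into the chunk and pulling back one page worth of bytes. The page size, chunk path and chunk offset below are placeholders; on a real instance you'd get them from onstat -b and onstat -d, and you'd need read permission on the chunk device:

```python
import os

PAGE_SIZE = 2048  # placeholder: use the actual page size of your dbspace

def read_page(chunk_path, chunk_offset, pageno):
    """Read one raw page from a chunk file or device.

    chunk_path and chunk_offset are hypothetical stand-ins for the
    pathname and offset reported by onstat -d.
    """
    with open(chunk_path, "rb") as f:
        f.seek(chunk_offset + pageno * PAGE_SIZE, os.SEEK_SET)
        page = f.read(PAGE_SIZE)
    if len(page) != PAGE_SIZE:
        raise IOError("short read at page %d" % pageno)
    return page
```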
Posted by Andrew Ford at 4:37 PM
Thursday, January 15, 2015
In Part 1 I talked about how I knew that the prescribed method of resolving pending in place alters, simply issuing a single large dummy update statement to touch every row, wouldn't work for my large table. There are just too many problems with long transactions, locks and excessive I/O that can trash the engine.
I decided the first thing I would try would be to update every row in the table, but do it in multiple transactions to avoid long transactions and holding on to a lot of locks. Nothing too interesting, pretty standard stuff really. The logic looks like this:
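In rough strokes: walk the table in key order, dummy update one slice at a time, and commit after every slice so no transaction runs long and locks don't pile up. Here is a minimal Python sketch of that pattern over a DB-API style connection; the table name, key column, key bounds and batch size are placeholders of mine, not the script's actual values:

```python
def dummy_update_in_batches(conn, table, keycol, min_key, max_key, batch_size=10000):
    """Dummy-update every row in key order, committing after each batch
    so no single transaction gets too long.

    conn is any DB-API style connection; the table/keycol names and the
    batch size are placeholders for illustration.
    """
    sql = ("update %s set %s = %s where %s between ? and ?"
           % (table, keycol, keycol, keycol))
    lo = min_key
    while lo <= max_key:
        hi = min(lo + batch_size - 1, max_key)
        cur = conn.cursor()
        cur.execute(sql, (lo, hi))  # dummy update: forces the row rewrite
        conn.commit()               # short transaction, locks released
        cur.close()
        lo = hi + 1
```

Each iteration updates at most batch_size rows and commits immediately, so the longest transaction and the largest lock footprint are both bounded by the batch size.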
Posted by Andrew Ford at 5:43 PM
Friday, November 21, 2014
I Sure Did Miss You
Truth be told, I just haven't had much to write about until now. I haven't been doing much of anything interesting with Informix. Sure, I've been doing the day to day Informix DBA stuff, but nothing blog worthy. Mostly I've been working on some software development, data mining and business optimization projects for my company. But that all changed this week.
I want to upgrade to v12.1, but I'm a big fat chicken and I've got a pending in place alter on my most importantest of important tables. The one that holds account information and balances for over 1 billion accounts. The manual tells me I should probably resolve all of my IPAs before performing an in place upgrade. Could I take a chance and upgrade the engine without following the manual's advice? Sure. Have I done it before without problems? Yeah, but I'm not proud of it and those upgrades weren't on our most precious table, so this time I'm going to follow the rules.
Part 1 - What to Expect and What I Knew Wouldn't Work
This will be an 89 part series (give or take a few) on how I accomplished this task without any downtime, without killing performance, without locking up the table and, most importantly, without updating every gosh darned one of those 1 billion rows like the manuals suggest you do. (Ok, full disclosure: I'm still working on the last little bit, but I think I've got this nut cracked.)
I have submitted a proposal for IIUG2015 on this topic, so if you don't feel like reading you can just wait for the movie. You will have to buy a ticket and they will be on sale soon. If you too would like to present something at IIUG2015 and receive a complimentary pass, there is still time to submit a proposal. This is the 20th anniversary of the first Informix conference in Chicago and it will be fun and educational as always.
What I knew wouldn't work - The manuals seem to assume we are all operating on tables with only a handful of rows, and their simple solution to resolving IPAs is to just dummy update every row in the table.
update your_tiny_table set your_field = your_field where 1 = 1;
Well, we all know that will wreak havoc on a system if you're updating a table of any substantial size. Locks, long transactions, evil I/O, oh my. There has to be a better way, and there is. Next time I'll show you the first thing I tried. (Spoiler alert: it didn't work either.)
Posted by Andrew Ford at 2:08 PM
Friday, March 21, 2014
You may have already read this article by my friend and "Benchmark Guy", Eric Vercelletto, but it is so well done and contains so many great thoughts and observations that I think it needs to be posted here for you to read again (or for the first time).
Where is Informix?
Posted by Andrew Ford at 8:06 AM
Tuesday, December 17, 2013
I posted this on the IDS SIG yesterday. Here it is again in case you missed it and are interested.
The IIUG2014 CPC met at the JW Marriott in Miami, FL this weekend to prepare for the 2014 conference. I thought I would give an update on where we are and pass along my experience with the city and hotel, since this will be a new location for a lot of us in April.
The hotel is beyond nice. The place is 98.34% Marble and Mahogany and smells terrific. Yes, it is weird to compliment a hotel on their smell, but I thought this to myself each time I entered the lobby. It is the nicest hotel I have ever stayed in and unless I hit the Florida Lotto, it will be the nicest hotel I ever stay in. The hotel is not undergoing a remodel nor are they planning on any major construction during the conference (for those that attended the Overland Park conferences, this is important).
Friday, May 3, 2013
I have no idea how IBM decides what new features to add to Informix, but I do know that we can now be part of the discussion by using the new Request for Enhancement tool (RFE Tool).
I took this for a spin today and I must admit this is a pretty interesting thing that you should check out. Not only can you submit your own requests for new features, you can view what everyone else has suggested and vote for what you want to have added.
There are a lot of good ideas in there and I really hope to see some of these feature requests in later releases.
Take the RFE for a spin today. Submit a request, it is fun.
To see the Informix specific RFEs, search under Brand: Information Management, Product Family: Informix.
Friday, April 5, 2013
Sometimes you have to store a string of numbers in a CHAR column, probably because the string of digits represents an account number or something similar and storing it as an INTEGER or BIGINT doesn't really make sense. The account number could have leading zeros that would be lost if stored as an integer. Parts of the account number could carry special information, like positions 2, 3 and 4 identifying what department an account belongs to, and it might be useful to be able to select digit_string[2,4]. There are plenty of reasons to store numerical data in a string.
What is the best way to ensure that all of the characters in the string are actually numbers?
This is what I do, is there a better way to do it?
alter table my_table add constraint
    check (replace(rtrim(digit_string), " ", "x")::bigint >= 0)
    constraint my_table_ck1;

insert into my_table (digit_string) values ("123456");
1 row(s) inserted.

insert into my_table (digit_string) values ("abc123");
1213: A character to numeric conversion process failed

insert into my_table (digit_string) values (" 123456");
1213: A character to numeric conversion process failed

update my_table set digit_string = "xyzpdq" where digit_string = "123456";
1213: A character to numeric conversion process failed
The constraint will try to cast the digit string to a BIGINT; if the cast succeeds, then all of the characters in the string are numbers. If it fails, we get an SQL error and the bad data never makes it into our database.
The replace(rtrim()) combination catches white space in the digit string that would not, on its own, cause the cast to a BIGINT to fail: rtrim strips trailing spaces, and replace turns any remaining spaces into an "x", which does make the cast fail.
There are plenty of other ways to accomplish the same thing, but I like this way.
You could rely on the application to check the digit string before it inserts/updates the database, but I'm pretty sure this isn't the best way.
You could write a stored procedure that is run by insert/update triggers, but I don't think that is more efficient than the check constraint/cast to BIGINT method. This would have the benefit of being able to raise a user defined SQL error instead of the odd -1213 error, though.
Posted by Andrew Ford at 3:44 PM
Tuesday, March 26, 2013
11:58 AM: Waiting for the IBM Informix It's Simply Powerful Webcast to start and on my screen I see IBM Informix 12.1, so I guess it is officially announced.
12:00 PM: Moderator is giving the rules and regulations of the Webcast. Questions will be answered after the Webcast.
12:01 PM: Chad Gates from Pronto Software, John Miller Informix Lead Architect and Sally Hartnell from IBM Marketing filling in for Jerry. Where's Jerry? He is unavoidably detained.
12:03 PM: 12+ Years of Informix Innovation with IBM
12:03 PM: Over 190 new partners in 2012
12:04 PM: Overview of the new stuff in 12.1. Cloud, Ease of Use, Warehouse, Sensor Data Management and something else I missed
12:05 PM: TimeSeries for Sensor Data. 5x Performance using 1/5 the resources as the competition
12:07 PM: Compression: Reduces storage and improves performance.
12:08 PM: JM3 talking about compression now. NEW! Index compression. NEW! Blob compression.
12:10 PM: NEW! Automatic table compression
12:11 PM: NEW! Primary Storage Manager replaces ISM for more backup solution options
12:12 PM: Chad from Pronto Software now talking about their EVP experience.
12:13 PM: Pronto has an ERP product that embeds Informix and Cognos. Informix initially picked for the OLTP capabilities. Informix 12.1 improves OLTP performance and OLAP performance benefiting from Informix Team working closely with the Cognos Team.
12:26 PM: Pronto experiences massive performance gain when concurrently running OLTP and OLAP on 12.1 over 11.x
12:28 PM: "Informix Warehouse Accelerator gaining worldwide traction to accelerate warehouse queries up to 100+ times"
12:29 PM: Back to JM3 on IWA improvements. NEW! Trickle Feed (cool) can now have real time analytics vs. refreshing the entire warehouse. NEW! Automated Partition Refresh. NEW! IWA and OAT integration.
12:31 PM: NEW! IWA and TimeSeries integration. IWA analytics over TimeSeries data.
12:32 PM: Flexible Grid/ER - NEW! ER no longer requires a Primary Key.
12:34 PM: Execute SQL over the grid - Query Sharding, that's sharding with a D.
12:35 PM: Talking about Hypervisor edition for Virtual/Cloud based deployments.
12:36 PM: Informix Genero accelerates new generation of mobile and cloud-based apps.
12:36 PM: Sally: Informix integrated with the IBM Mobile Database. Sync mobile db data with Informix backend.
12:37 PM: JM3: NEW! Mobile OAT for your phone or tablet
12:38 PM: Improved OAT out of the box experience, OAT GUI deployed as part of Informix install
12:39 PM: Sally: Smart Choice of ISVs and OEMs. Small footprint, silent install, up and running in minutes, 0 administration, autonomics. NEW! Dynamic ONCONFIG, Self Healing, Self Optimizing
12:40 PM: About to wrap up? Already? Oh, right. Q/A at the end. I want MOAR new features :)
12:41 PM: Bundling of Cognos licenses with new Advanced (Workgroup/Enterprise) Editions
12:41 PM: IIUG 2013 April 21-25, 2013 San Diego, CA
12:43 PM: Q/A starts.
12:43 PM: "Is compression available in Workgroup?" Sally says Compression included in Advanced Enterprise, available for purchase in Enterprise.
12:44 PM: "64 bit OAT?" JM3 says currently only 32 bit, but you can run 32 bit version on Windows 64 bit. Looking to have a 64 bit version for Windows in the future.
12:45 PM: "Is OAT faster in 12.1?" JM3 says ability to run update stats on sysmaster will allow OAT to run faster
12:46 PM: "Is Pronto using compression?" JM3 says no, perf gains are without compression
12:47 PM: "New tools to migrate FROM Oracle?" JM3 says yes, a lot of technology has been added to assist in migrations.
12:48 PM: "Will Mobile OAT work with my 11.x server?" JM3 says yes
12:48 PM: "Where can I find more info about the new editions?" Sally says go to ibm.com/informix and view the new brochure. More detail: google Carlton Doe Informix Editions or google ibm software announcement 213-156
12:50 PM: "Any plans to do a benchmark?" Sally says they prefer industry specific real world benchmarks with their customers. Soon to publish a Meter Data Management benchmark.
12:52 PM: "Is ontape still supported?" JM3 says ontape and onbar still supported in 12.1. onbar just improved with PSM.
12:53 PM: "Can I get Cognos Express bundled instead of the full Cognos?" Sally says no.
12:54 PM: "What do I need to do to use the compression features?" Sally says compression included in Advanced Enterprise, add on for Enterprise.
12:54 PM: "What is the #1 thing to remember from this webcast?" JM3 says the great improvements in OTLP/OLAP performance.
12:55 PM: "Is OAT built using a new version of PHP?" JM3 says yes, OAT uses a later version of PHP.
12:56 PM: "Tell us more about IBM Mobile" Sally says it is included with all for-pay versions of Informix and is a secure persistent storage for data on a device that allows backend synchronization to an Informix DB.
12:57 PM: "Can 12.1 replicate TimeSeries data?" JM3 says, yes TimeSeries can now be replicated via HDR/SDS/RSS, etc.
12:58 PM: Sally notes the great attendance to this Webcast and gives a shout out to IIUG 2013 (thanks Sally)
12:59 PM: End of Webcast, perfectly timed. Replay of webcast will be made available online.