Tuesday, March 26, 2013

For My 100th Post: IBM Informix 12.1 - It's Simply Powerful

11:58 AM: Waiting for the IBM Informix It's Simply Powerful Webcast to start and on my screen I see IBM Informix 12.1, so I guess it is officially announced.

12:00 PM: Moderator is giving the rules and regulations of the Webcast. Questions will be answered after the Webcast.

12:01 PM: Chad Gates from Pronto Software, John Miller (Informix Lead Architect) and Sally Hartnell from IBM Marketing, filling in for Jerry. Where's Jerry? He is unavoidably detained.

12:03 PM: 12+ Years of Informix Innovation with IBM

12:03 PM: Over 190 new partners in 2012

12:04 PM: Overview of the new stuff in 12.1. Cloud, Ease of Use, Warehouse, Sensor Data Management and something else I missed

12:05 PM: TimeSeries for Sensor Data. 5x Performance using 1/5 the resources as the competition

12:07 PM: Compression: Reduces storage and improves performance.

12:08 PM: JM3 talking about compression now. NEW! Index compression. NEW! Blob compression.

12:10 PM: NEW! Automatic table compression

12:11 PM: NEW! Primary Storage Manager replaces ISM for more backup solution options

12:12 PM: Chad from Pronto Software now talking about their EVP experience.

12:13 PM: Pronto has an ERP product that embeds Informix and Cognos. Informix was initially picked for its OLTP capabilities. Informix 12.1 improves both OLTP and OLAP performance, benefiting from the Informix team working closely with the Cognos team.

12:26 PM: Pronto experienced a massive performance gain when concurrently running OLTP and OLAP on 12.1 over 11.x

12:28 PM: "Informix Warehouse Accelerator gaining worldwide traction to accelerate warehouse queries up to 100+ times"

12:29 PM: Back to JM3 on IWA improvements. NEW! Trickle Feed (cool) can now have real time analytics vs. refreshing the entire warehouse. NEW! Automated Partition Refresh. NEW! IWA and OAT integration.

12:31 PM: NEW! IWA and TimeSeries integration. IWA analytics over TimeSeries data.

12:32 PM: Flexible Grid/ER - NEW! ER no longer requires a Primary Key.

12:34 PM: Execute SQL over the grid - Query Sharding, that's sharding with a D.

12:35 PM: Talking about Hypervisor edition for Virtual/Cloud based deployments.

12:36 PM: Informix Genero accelerates new generation of mobile and cloud-based apps.

12:36 PM: Sally: Informix integrated with the IBM Mobile Database. Sync mobile db data with Informix backend.

12:37 PM: JM3: NEW! Mobile OAT for your phone or tablet

12:38 PM: Improved OAT out of the box experience, OAT GUI deployed as part of Informix install

12:39 PM: Sally: Smart Choice of ISVs and OEMs. Small footprint, silent install, up and running in minutes, 0 administration, autonomics. NEW! Dynamic ONCONFIG, Self Healing, Self Optimizing

12:40 PM: About to wrap up? Already? Oh, right. Q/A at the end. I want MOAR new features :)

12:41 PM: Bundling of Cognos licenses with new Advanced (Workgroup/Enterprise) Editions

12:41 PM: IIUG 2013 April 21-25, 2013 San Diego, CA

12:43 PM: Q/A starts.

12:43 PM: "Is compression available in Workgroup?" Sally says Compression included in Advanced Enterprise, available for purchase in Enterprise.

12:44 PM: "64 bit OAT?" JM3 says currently only 32 bit, but you can run 32 bit version on Windows 64 bit. Looking to have a 64 bit version for Windows in the future.

12:45 PM: "Is OAT faster in 12.1?" JM3 says the ability to run update statistics on sysmaster will allow OAT to run faster

12:46 PM: "Is Pronto using compression?" JM3 says no, perf gains are without compression

12:47 PM: "New tools to migrate FROM Oracle?" JM3 says yes, a lot of technology added to assist in migrations.

12:48 PM: "Will Mobile OAT work with my 11.x server?" JM3 says yes

12:48 PM: "Where can I find more info about the new editions?" Sally says go to ibm.com/informix and view the new brochure. More detail: google Carlton Doe Informix Editions or google ibm software announcement 213-156

12:50 PM: "Any plans to do a benchmark?" Sally says they prefer industry-specific, real-world benchmarks with their customers. Soon to publish a Meter Data Management benchmark.

12:52 PM: "Is ontape still supported?" JM3 says ontape and onbar still supported in 12.1. onbar just improved with PSM.

12:53 PM: "Can I get Cognos Express bundled instead of the full Cognos?" Sally says no.

12:54 PM: "What do I need to do to use the compression features?" Sally says compression included in Advanced Enterprise, add on for Enterprise.

12:54 PM: "What is the #1 thing to remember from this webcast?" JM3 says the great improvements in OTLP/OLAP performance.

12:55 PM: "Is OAT built using a new version of PHP?" JM3 says yes, OAT uses a later version of PHP.

12:56 PM: "Tell us more about IBM Mobile" Sally says it is included with all for-pay versions of Informix and is a secure persistent storage for data on a device that allows backend synchronization to an Informix DB.

12:57 PM: "Can 12.1 replicate TimeSeries data?" JM3 says, yes TimeSeries can now be replicated via HDR/SDS/RSS, etc.

12:58 PM: Sally notes the great attendance to this Webcast and gives a shout out to IIUG 2013 (thanks Sally)

12:59 PM: End of Webcast, perfectly timed. Replay of webcast will be made available online.

Thursday, March 7, 2013

Pseudo strtok in SPL

I needed a way to extract the individual words from a sentence stored in a single character field. After some failed google searches and no desire to install a Datablade or write a C UDR for something that doesn't need to have killer performance, I decided to write my own quick and dirty SPL function.

my_strtok(str, delim, token_num) will take a string, break it into individual tokens based on the delimiter and return the Nth token of the string.

Running this:

execute function my_strtok("How now brown cow", " ", 3)

Would return the third token:

token  brown

1 row(s) retrieved.

Here is the code for my_strtok(), comments welcome on anything I might have missed in the logic. And when I say it is slow, I just mean it could be done in a different way and perform more efficiently, but for what I needed it works.

create function my_strtok (str lvarchar(2048), delim char(1), token_num smallint)
returning lvarchar(2048) as token;

        define str_len integer;
        define start_pos integer;
        define stop_pos integer;
        define cur_token_num integer;

        -- initialize start position and current token number to 1
        let start_pos = 1;
        let cur_token_num = 1;

        -- remove any leading delimiters from the input string
        let str = ltrim(str, delim);

        -- save the input string length so we don't have to recalculate it later
        let str_len = length(str);

        -- find the start of the token we want to return

        -- while there is still more string available to process
        while (start_pos <= str_len)
                -- if the current token number is the token we want, stop looking
                -- for a start position
                if (cur_token_num = token_num) then
                        exit while;
                end if;

                -- increment the start position to the next character
                let start_pos = start_pos + 1;

                -- check to see if the current character in the string is a delimiter
                if (substr(str, start_pos, 1) = delim) then
                        -- we have found the next token
                        let cur_token_num = cur_token_num + 1;

                        -- advance the token start position past any repeating delimiters
                        while (start_pos <= str_len)
                                let start_pos = start_pos + 1;

                                if (substr(str, start_pos, 1) != delim) then
                                        -- there are no more repeating delimiters
                                        -- stop looking for repeating delimiters
                                        exit while;
                                end if;
                        end while;
                end if;
        end while;

        -- we now either have the start position of the token we are looking for
        -- or we did not find the token we were looking for
        -- if we did not find the token, return NULL
        -- if we did find the token we were looking for, find the end of the token

        if (cur_token_num = token_num) then
                -- we found the token
                let stop_pos = start_pos;

                -- while there is still string to process try to find the end of our token
                -- if we run out of string before we find the next delimiter then
                -- our token ends where the string ends
                while (stop_pos <= str_len)
                        let stop_pos = stop_pos + 1;

                        if (substr(str, stop_pos, 1) = delim) then
                                -- we found the end
                                let stop_pos = stop_pos - 1;
                                exit while;
                        end if;
                end while;

                -- return the found token
                return substr(str, start_pos, stop_pos - start_pos + 1);
        else
                -- the token was not found
                return NULL;
        end if;
end function;

execute function my_strtok("Simple test", " ", 1);

token  Simple

1 row(s) retrieved.

execute function my_strtok("Simple test", " ", 2);

token  test

1 row(s) retrieved.

execute function my_strtok("    Leading delimiters", " ", 1);

token  Leading

1 row(s) retrieved.

execute function my_strtok("Repeating       delimiters", " ", 2);

token  delimiters

1 row(s) retrieved.

execute function my_strtok("Token not found", " ", 4);


1 row(s) retrieved.

execute function my_strtok("Should have checked for invalid input", " ", -1);


1 row(s) retrieved.

execute function my_strtok("Invalid input works, but is unecessarily slow", " ", -1000);


1 row(s) retrieved.

execute function my_strtok("Empty delimiter defaults to space, convenient", "", 6);

token  convenient

1 row(s) retrieved.
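For readers outside the SPL world, here is a quick Python sketch of the same behavior for comparison (my own naming, not part of the SPL function above): leading and repeating delimiters are skipped, an out-of-range or invalid token number yields nothing, and an empty delimiter is treated as a space to mirror the convenient default shown above.

```python
def my_strtok(s, delim, token_num):
    """Return the Nth (1-based) token of s split on delim, or None."""
    if not delim:
        delim = " "  # mirror the SPL behavior: empty delimiter defaults to space
    # splitting and dropping empty strings handles leading and repeating delimiters
    tokens = [t for t in s.split(delim) if t != ""]
    if 1 <= token_num <= len(tokens):
        return tokens[token_num - 1]
    return None  # token not found, or invalid token number

print(my_strtok("How now brown cow", " ", 3))          # brown
print(my_strtok("Repeating       delimiters", " ", 2)) # delimiters
print(my_strtok("Token not found", " ", 4))            # None
```

Unlike the character-by-character SPL loop, this version pays the cost of splitting the whole string up front, but for occasional use the difference is irrelevant.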

Tuesday, March 5, 2013

Where Are They Now?

Where did I find this picture, can you identify anyone in this picture and what is the guy on the left looking at on the ground?


Monday, March 4, 2013

From the SIGs - LTXHWM and LTXEHWM

John Adamski posted a question to the IIUG SIGs about how to identify the session that caused the long transaction that eventually put his system in a Blocked:LONGTX state. A few of us came back with responses, but it wasn't until John Miller III from IBM and Informix Fun Facts replied with "finding the session that caused your long transaction isn't very useful; you need to prevent this situation from happening with the LTXHWM and LTXEHWM ONCONFIG parameters" that I realized these config parameters are typically underutilized.
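For reference, both parameters live in the ONCONFIG file and are expressed as percentages of total logical-log space. The values below are only illustrative; check the defaults and recommendations for your version before changing anything:

```
# ONCONFIG fragment -- illustrative values, not recommendations
LTXHWM   70   # Long-transaction high-water mark: when a single transaction
              # spans this percentage of logical-log space, the engine
              # starts rolling it back
LTXEHWM  80   # Exclusive high-water mark: beyond this percentage, the
              # transaction being rolled back gets exclusive access to
              # the logical logs so the rollback can complete
```

The idea is exactly what JM3 described: with sane values here, a runaway transaction gets rolled back long before it can pin the whole system in Blocked:LONGTX.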

Friday, March 1, 2013


"Holy Cow, two blog posts in one day!" - Harry Caray

Ben Thompson over at Informed Mix recently wrote about using "select for update/where current of" syntax and in the mother of all coincidences one of the developers that writes code that hits my Informix engines came over to tell me about the evolution of performance improvements he went through to speed up a bulk data delete application. Here is his story, from static SQL all the way to prepared statements using the "select for update/where current of" syntax.

Informix News of the Weird

Informix Engine Defies Laws of the Universe

I recently inherited an Informix engine from another department as part of a server migration and upgrade to Innovator-C. In my lifelong pursuit of finding unattended Informix engines with ridiculously long uptimes I did an 'onstat -' to see how many days this engine had been running.

[informix]$ onstat -

IBM Informix Dynamic Server Version 10.00.UC3R1   -- On-Line (Prim) -- Up 869 days 12:02:13 -- 2423888 Kbytes

Hey, 2.38 years without an engine bounce, not too shabby! When I started to brag about the resiliency of Informix to the previous owner of this engine, who deals mostly with other database engines, he came back with, "Well, that IS pretty amazing considering the server was rebooted a year ago."

[informix]$ uptime
 09:16:11  up 372 days,  9:01,  2 users,  load average: 0.30, 0.41, 0.43

Informix really is amazing. Show me any other engine that can continue to run when the server is offline. What is more impressive is that this was version 10; I can only imagine what the Informix developers have in store for us in Centaurus.