Author Archive

DB2 for i › Dealing with Blank Checks in DB2

March 24th, 2015 Comments off
"I think IBM i developers believe they must help DB2..."

hmmm, unfortunately this is sad but true.

The science of database must be understood before the art of database can be practiced. This includes the relatively simple task of comparing a column's value to a literal.  How hard can it be to compare and test for blanks? You know, the spaces sitting in an empty NOT NULL character column.

Well, apparently it's more difficult on some platforms than others, and SQL coders jump through hoops and attempt back flips to "help" the database engine.
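As a quick sketch of the point (the table and column names here are hypothetical, not from Kent's article): on DB2 for i, fixed-length character comparisons are blank padded, so testing for an all-blank column needs no scalar-function gymnastics.

```sql
-- Hypothetical table: ORDERS with a NOT NULL CHAR(10) column SHIP_CODE
-- that contains blanks when no code has been assigned.

-- The "helpful" (and unnecessary) way - wrapping the column in a function
-- can prevent DB2 from using an index over SHIP_CODE:
SELECT order_id
  FROM orders
 WHERE TRIM(ship_code) = '';

-- The simple way - fixed-length character comparisons are blank padded,
-- so a single space matches an all-blank column:
SELECT order_id
  FROM orders
 WHERE ship_code = ' ';
```

The second query says exactly what is meant, and it leaves the optimizer free to use any index over the column.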

For more information and coaching, please see Kent Milligan's excellent article on the topic here.

If you want to get better at understanding the science of DB2 for i, and proficient at practicing the art of relational database with IBM i, then please reach out - we're here to help you become successful.

Read the original at DB2 for i.

Categories: Blogs Tags:

DB2 for i › Introducing DB2 Web Query DataMigrator!

February 27th, 2015
IBM has announced a new product offering in the DB2 Web Query family portfolio.

DB2 Web Query DataMigrator ETL Extension (i.e. DataMigrator) provides extract, transformation, and load (ETL) capabilities integrated with DB2 Web Query and DB2 for i.

Users looking to isolate their query workloads from production environments, or needing to automate the consolidation of data from a variety of data sources will be interested in this solution.

DataMigrator allows you to automate the process of extracting data from any DB2 Web Query supported source database, transforming or cleansing the data, and loading it into an optimized DB2 for i reporting repository. Depending on the design and use of that repository, the industry refers to this as an operational data store, data mart, or data warehouse.

DataMigrator leverages the same graphical look and feel as DB2 Web Query, and there are integration points between the two products. For instance, metadata defined in either DataMigrator or DB2 Web Query can be synchronized.

For more information, refer to the following documents:

IBM US Announcement

IBM Europe, Middle East, and Africa Announcement

Frequently Asked Questions document

If you need assistance with architecture, design or implementation of your analytics environment running in IBM i, please reach out - we are here to make you successful!

Read the original at DB2 for i.


DB2 for i › Using the Blueprint

January 28th, 2015
In the previous post, Dan Cruikshank illuminated the concept of creating a database "blueprint" through the process of data modeling.  Continuing my conversation with Dan, I asked him about using the blueprint, i.e. data model.  Specifically, using SQL to access the data (the answer is YES!) and whether or not it makes more sense to use embedded SQL in a traditional high level language like RPG, or to use separate, and modular Stored Procedures.

Dan's answer and insight follows...


The answer to the question of whether to use SQL Stored Procedures (or External Stored Procedures) versus embedded SQL in a monolithic program depends on the environment in which the application exists. What that means is: if you are developing both host centric and external (i.e. web or mobile) applications which need to perform common IO functions, such as consuming result sets and inserting, updating, and deleting rows in a common set of tables, then using stored procedures would be my recommendation. If all development is strictly host centric (it should not be in this day and age), then embedded SQL would be ok, but not best practice, in my opinion.

From an application methodology perspective, we tend to recommend a Model/View/Controller (MVC) approach for design and development, where the data access and user interface are separated from the high level language code. In the case of data access this would be done via stored procedure calls.

This is not a performance based recommendation; it is more about re-usability, reduced development costs and shorter time to market. Done properly, better performance is a byproduct. Not to mention higher value returned to the business!

In addition, with the added SQL DML support for result set consumption in IBM i 7.1, it is now easier for the host centric applications (i.e. RPG and COBOL) to share a common set of procedures which return result sets. From an SQL DML perspective, prior to 7.1 this was limited to insert, update, and delete procedures, although accessing a result set from a stored procedure was available via CLI calls (not widely used in traditional IBM i shops).
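To make that pattern concrete, here is a minimal sketch; the procedure, table, and column names are hypothetical. A stored procedure declares a cursor WITH RETURN, and a 7.1-or-later embedded SQL caller consumes the result set without resorting to CLI:

```sql
-- Hypothetical procedure returning a result set of customers by region.
CREATE OR REPLACE PROCEDURE get_customers_by_region (IN p_region CHAR(2))
  DYNAMIC RESULT SETS 1
  LANGUAGE SQL
BEGIN
  DECLARE c1 CURSOR WITH RETURN FOR
    SELECT customer_id, customer_name
      FROM customers
     WHERE region = p_region;
  OPEN c1;
END;

-- An embedded SQL caller (RPG, COBOL) on 7.1 or later can then consume
-- the result set directly:
--   EXEC SQL CALL get_customers_by_region('MN');
--   EXEC SQL ASSOCIATE RESULT SET LOCATOR (:rs1)
--            WITH PROCEDURE get_customers_by_region;
--   EXEC SQL ALLOCATE c2 CURSOR FOR RESULT SET :rs1;
--   EXEC SQL FETCH c2 INTO :id, :name;
```

Any number of host centric and external applications can then share the one procedure, which is exactly the reuse Dan describes.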


If you need additional knowledge, skills or guidance with what Dan is sharing, please do not hesitate to reach out.

Read the original at DB2 for i.


DB2 for i › The Blueprint

January 16th, 2015
This post is the result of a recent conversation about the importance of proper data modeling with my long time friend and teammate Dan Cruikshank.

For the readers who are not familiar with Dan, he is the person who invented the concept of database modernization and the methodology to pull it off in a responsible way; one of the many contributions Dan has provided to his appreciative clients around the world. We affectionately refer to the IBM i database modernization process as "the Dan Plan".

Dan, the blog is yours...


About 15 years ago my wife and I jumped on an opportunity to buy a used home that was in foreclosure. The price was right, the location was fantastic; unfortunately the house was in rough shape.

There was no flooring in the main rooms, several of the walls had holes, and who knew what the infrastructure was like. We were looking at years of reconstruction and possibly tens of thousands of dollars in cost.

As we were going through the closets we discovered a set of the original blueprints. Suddenly years of work was looking like weeks, at a cost of mere thousands of dollars.

About this same time my career at IBM took on a new slant. I was beginning to see that many of the performance issues I was then dealing with all seemed to be rooted in the same cause – a poor database design. IBM Rochester was launching the new SQE engine, which boasted brand new SQL capabilities that took advantage of the IBM i integrated relational database "DB2 for i". Unfortunately, many IBM i heritage customers were still using traditional record level access, and few had a database properly designed for SQL set based access.

“Oh woe is me”, cried those customers who were now faced with a reconstruction nightmare – how to bring their applications and data into the new millennium without taking years of effort or spending millions of dollars on “modernization”. 

“If only we had been more diligent on documenting our applications”, lamented the growing number of CIOs who were now tasked with groveling for more budget dollars. “If only we had a blueprint!” they cried.

Never fear, there is a silver lining in this story. Hidden away, in a secret closet within the Rational suite of development tools, is something called the Data Perspective. The Data Perspective comes with Rational Business Developer, InfoSphere Data Architect and other bundled products; and it is included in the free (yes free) download of IBM Data Studio.

Within the Data Perspective is the Data Design Project tool. Using a Data Design Project, a database engineer can reverse engineer an existing set of tables (or DDS files) into a graphical physical data model, in other words a blueprint! This can be done for an entire schema (library) or for only those core files required to support the business.

But wait, there’s more.

Unlike the pencil drawings of yore, or the collection of shapes in a presentation tool, with a touch of a button the database engineer can generate the SQL Data Definition Language (DDL) statements from the physical data model. And, let me catch my breath, the DDL will be generated regardless of whether the originating source was DDS or SQL DDL. That is too cool.
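For illustration only, the DDL generated from a reverse-engineered model might look something like this (the schema, table, and column names are invented for this sketch, not output from the tool):

```sql
-- Sketch of generated DDL for a physical file reverse engineered into
-- a physical data model. All names here are hypothetical.
CREATE TABLE myschema.customer (
  customer_id    INTEGER      NOT NULL,
  customer_name  VARCHAR(50)  NOT NULL,
  region         CHAR(2)      NOT NULL DEFAULT ' ',
  CONSTRAINT customer_pk PRIMARY KEY (customer_id)
);
```

The point is that the model, not hand-typed source, becomes the single authoritative definition of the table.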

And I almost left out the best part – the compare tool.

Imagine if I could have taken those original blueprints of the house, changed them and then pushed a button and my home would magically be transformed to match the blueprint. Not possible with home re-engineering projects but it is available with the Data Perspective. I can compare the graphical model or blueprint to the physical implementation and the tool will generate the appropriate ALTER, DROP and/or CREATE DDL statements, in either direction. I can apply emergency changes to the DB2 for i database and then sync up the model.

Of course having a blueprint is one thing, getting the re-engineering process right is another.

All projects are going to require some boots on the ground. In other words, the developers who have to make the changes. These "engineers" will require a little more detail, especially when ripping apart existing programs to expose the inner workings.

Oh joy, there is another secret closet in the Rational Developer tool box – the Visual Application Diagram maker. This device can take RPG or COBOL code and present it in graphical form. And what’s more, the engineer can click anywhere in the diagram to display the underlying code.

Whether you’re a database engineer or application developer, it is now time for you to take your skills to the next level. If you are not using the Rational tools then acquire them now. If you are using them, then don’t be afraid to explore some of those secret closets.

And if you're afraid of the dark, please reach out, we can provide some hand holding.

Read the original at DB2 for i.


DB2 for i › Resolve to Gain New Knowledge and Skill

December 18th, 2014
As we approach the end of 2014 and look forward to next year, I want to encourage you to take stock of your knowledge and skill as it pertains to DB2 for i and data centric design and programming.

Your value, to employers, clients, partners, and colleagues will diminish over time unless you "sharpen the saw".

To that end, I want to call your attention to a couple public classes we are offering in Rochester, Minnesota (aka: the home office).


DB2 for i Advanced SQL and Data Centric Programming

2015, February 17-19 

Skills taught


DB2 for i SQL Performance Monitoring, Analysis and Tuning

2015, February 24-27 

Skills taught


If you need more detail on what the respective classes cover, or why the knowledge and skill are critical success factors in the world of database engineering, please contact me. And if you want to explore a private and/or customized session, we can assist there as well.

“Give me six hours to chop down a tree and I will spend the first four sharpening the axe.”

 ― Abraham Lincoln

Read the original at DB2 for i.


DB2 for i › Trust, but Verify

October 8th, 2014
I am often asked about the risk of migrating to a new version/release of IBM i: "Should we go to 7.1 or 7.2?"

The same can be said about moving to the latest technology refresh (7.1 TR9 and 7.2 TR1 were just announced by the way).

I prefer to talk about the rewards of keeping up with the advances in technology - keep the tool belt fully stocked so to speak.

So, should you install the latest and greatest?  My usual answer is "yes, and...".

Whether you are configuring new hardware, putting on a new set of group PTFs, installing the latest TR or migrating to IBM i 7.2, my sincere advice is based on an old Russian proverb:

Trust, but Verify

What this really means is, YOU should be testing the new hardware, testing the group PTFs, verifying the TR code or the latest version of IBM i.  And I don't mean give it a spin for a few days on the development system.  I'm talking about proper and adequate testing; a real verification of the features and functions. Find out for yourself, do they behave as advertised?

Now here is the issue...  proper and adequate testing must be based on science, and some art.

SCIENCE, as in, using the scientific method:

  • Purpose or Question
  • Research
  • Hypothesis
  • Experiment
  • Analysis
  • Conclusion

And ART, as in - you have to be clever about how, when and where you apply the science.  If you are not testing the business processes that produce the transactions occurring in the production environment, you are not actually verifying anything, nor are you mitigating any risk. You are just fooling yourself.  And if you cannot pin down the variables and repeat the process consistently, the experiment will be inconclusive, and a waste of time.  I don't know how many times I have been in awkward conversations that go something like this:

DB2 user: "we just upgraded, my queries don't run fast anymore"

Mike: "I'm sorry to hear this... do you have any information captured about how the queries were running prior to the upgrade"?

DB2 user: "no"

Mike: "can you tell me how the queries were running prior to the upgrade"?

DB2 user: "yes, fast"

Mike: *heavy sigh*

When it comes to DB2 data integrity and data processing, three fundamental things need to be tested and verified:

  1. Correct results
  2. Performance
  3. Scalability

Correct results is the obvious one - did my request or process produce the expected answer or result?

Performance gets a lot of attention - did my request or process complete in the time expected?

Scalability is much more difficult to understand - did my request or process complete in the time expected when running with the full data set and under the stress of all the normal production activity?

My recommendation is that you get in a position to test (and verify!) that the new hardware and/or software meets your requirements BEFORE implementing anything in the production environment.  And with that said, verify your rollback strategy if something does slip by.

When it comes to testing and verifying DB2 for i, the point person should be your database engineer. If you don't have one, now is a good time to establish the position, install the candidates, and provide training and support. Don't forget to give them clear responsibility and authority to do the job.

If you don't have, or don't want to invest in a full fledged testing environment, or you want the subject matter experts to look over your shoulder, make a visit to IBM Rochester, Minnesota and embark on a performance and scalability engagement.

If you would like to discuss the science and art of verifying DB2 for i, please contact me.  We are here to help you be successful, and to mitigate risk.

Read the original at DB2 for i.


DB2 for i › Ready or Not?

September 5th, 2014
The following message is brought to you by senior level consultant, long time member of the IBM DB2 team, and my good friend Mr. Jim Denton...


Ready or not …

Here it comes.  Change is one of the constants in our lives as IT professionals – we are often confronted by new business requirements, increasing data and transaction volumes, shifting strategic priorities, opportunities to leverage analytics, data model limitations, long running or erratic queries, new user interface demands, integrating new applications, managing business consolidations and acquisitions, etc.  And practically everyone will have additional challenges to add to that list.

As part of the DB2 for i Center of Excellence team, I often get the opportunity to teach both the SQL Performance and the Advanced SQL courses.  Practically every session of those courses includes students saying “I wish I’d known.”  They’ve spent valuable time writing a complicated program instead of just a few carefully crafted SQL statements.  They’ve rewritten code to improve performance without knowing there were several easier options.  Or they’ve used SQL without leveraging SQL’s strengths for set-based processing.

Clearly, knowledge is our best defense against an onslaught of change.   So what can you do to prepare yourself?  Here are some ideas:

  • Keep reading blogs like this one!
  • Take advantage of the DB2 for i homepage and in particular, the vast array of whitepapers on the Resources tab

We understand your frustration when we are forced to cancel public workshops due to low enrollments.  Ask us about a private session.  It is probably less expensive than you think and we can train more members of your team at the same time.  We can also customize content and focus on what’s important to you. 

One last point -- I highly recommend that you always take credit for your successes.  This can be anything from talking to your colleagues at the coffee machine to executive level presentations.  This is no time to be modest!  Taking credit builds organizational momentum and demonstrates the value of ongoing investments in DB2 for i skills.

Your company has made significant business investments in IBM i and you have a long list of technical challenges which require new knowledge and new skills.  Take a minute and ask yourself:

Are you ready?

Read the original at DB2 for i.


DB2 for i › Governance and Control? Show Me

August 27th, 2014
In the state of Louisiana where I was born and spent some formative years, they often repeat a famous quote: "it ain't braggin' if you can do it".

And in the state of Missouri where I went to college, the unofficial motto is "show me". 

"Show me, do it".  There's no quicker way to separate fact from fiction, or truth from falsehood, than to request a demonstration. And if all goes well, you might learn something; or at least be entertained!

With that in mind, I am excited to announce the results of a recent project that clearly demonstrates one of the major new database enhancements delivered in IBM i 7.2.

Our DB2 for i Center of Excellence team partnered with the IBM International Technical Support Organization (think Redbooks) and the smart, creative folks in IBM i Development to illuminate and exercise Row and Column Access Control. The fruit of this labor is an ITSO Redpaper describing design, definition and implementation scenarios of DB2 row permissions and column masks.


Show Me

If you are interested in establishing effective policies and gaining control of row and column access in a truly data centric way, I want you to do three things:

1. Go here and watch the brief video explaining the situation, then share it with your business leaders - especially the owners of the data.  There is a version in Spanish as well as in English.

2. Go here and download the DB2 for i Row and Column Access Control Redpaper.  Read it.  Think about it.  Generate awareness in your organization.

3. Identify an executive champion - someone who will have to answer the phone when sensitive data lands on the wrong desk.  Initiate a project with assistance from the IBM DB2 for i Center of Excellence team.  In other words, point out the problem (unfettered access to all the data), and then suggest the solution (RCAC).

As always, feel free to contact me if you need help with articulating the situation at any level of your organization.

And remember, data is valuable. Please protect it.

Read the original at DB2 for i.


DB2 for i › DB2 for i… OPEN for Business

May 13th, 2014
If you have been following along for a while, I am sure you have come to realize that DB2 for i is one of the most open data centric platforms around.

Whether you are accessing:

  • via embedded SQL (static or dynamic - you choose)
  • via ODBC or JDBC connections
  • via the call level interface (CLI)
  • from PHP running in IBM i
  • via the tried and true, standard connectivity mechanism known as DRDA

you should be able to get to your data wherever you are, while restricting "others" from doing so.

Mr. Milligan (technical writer extraordinaire) has recently authored an article on accessing DB2 for i from a Linux client.  This is a very important topic given the proliferation of Linux in the marketplace. Keeping your data safely within the confines of DB2 while being able to make use of it from IBM i, AIX, Linux or even Windows is the secret to success.

You can check out Kent's article beginning on page 13 in the latest issue of iPro Developer.

And of course, if you need any assistance with getting connected, Kent and the team are here to help give you an introduction.

Read the original at DB2 for i.


DB2 for i › With IBM i 7.2, DB2 Gives You More!

April 28th, 2014
Today (Monday, April 28) brings the announcement of the long awaited IBM i version 7 release 2. And if you've been following along, you know this also means that the fully integrated DB2 for i moves forward as well.

Now, before illuminating a new powerful database capability residing in the best OS for business, let me remind you that over the past couple years, the good folks who develop DB2 for i have been enhancing the database capabilities within 7.1 too. These enhancements are known as "technology refreshes" or TRs. The latest iteration, TR8, was announced a few weeks ago - check it out here.

A Data Centric Attitude

For years now, my team has been going around the world educating clients and solution providers on the importance of designing and developing business applications with a data centric attitude. That is to say, employing sound relational database techniques, set-at-a-time principles and information management best practices. After all, this is how to get the most out of DB2 for i, now and in the future.

Architecturally speaking, we want to see applications that exhibit flexibility, extensibility and scalability. And obviously, business users want solutions in a timely and cost effective manner.

Utilizing modern tools and modern methods is the fastest and most efficient way to achieve this goal. Whether creating a new application or re-engineering an existing one, progressive techniques and using SQL are the keys to unlocking the valuable features and functions residing in the database management system woven into IBM i.

You Need Governance and Control 

As I engage clients in various settings, one topic that always seems to pop up is the need for governance and control. Unless you have been keeping your eyes closed and ears plugged, you realize that data is big. More accurately: storing, analyzing and exploiting data in ways that bring new insight and understanding is all the rage.

As the appetite for data increases, so does the importance of governance and control. Determining who is able to see what data is the responsibility of the data owner, and it is a fundamental aspect of designing with integrity and deploying with confidence. It might also keep you out of the spotlight.

In the old days of AS/400, nearly all of the data access was controlled through the high level language 5250 programs that sat at the heart of online transactional systems. To get to a program and ultimately to the data, the user had to sign on with a valid user profile and corresponding password. If successful, a menu was presented - their menu. The list of items on that menu dictated what the user could do and not do; what data they had access to, and did not have access to. In many cases, the underlying database objects were not secured and not restricted through object level security.  Why should they be? The only way to access the object was through the program, and the only way to call the program was through the menu.

Given the openness of DB2 for i, I trust you now realize that there are many different ways for a user to gain access to data outside of the old style menu based applications. For example, tools and applications that connect via ODBC and JDBC interfaces abound. The need to properly secure database objects directly via the powerful IBM i security features is a must. If I walk into your shop and try to access your physical files with SQuirreL, will I gain access?  If you don't know, or are not sure, we need to talk. Seriously.

Unfortunately, granting or revoking rights to the data seems to be an all-or-nothing proposition. If a user has access to the table, they have access to all the rows. If a user does not have access to the table, they have access to no rows. What if different departments all need access to the same table, but each department must be restricted to the subset of rows and columns it is authorized to see? Is there a way to allow each group of users access to only their respective data sets?

Vistas Grandes

Within the science and art of relational database there exists the ability to provide a logical "view" of the data. A VIEW is a virtual table, and as such, the database engineer can define and publish various views of the same data. This technique can be used to provide different groups of users with different sets of rows that they can "see". The trick is to ensure that the user is only accessing their particular VIEW and no other.

For a relatively small and stable set of different users, creating and utilizing VIEWs is an elegant and acceptable way to control access to data, whether it be a particular set of rows or a subset of columns.

The VIEW is also very useful when needing to transform data and/or hide complexity, such as join syntax or aggregation specifications. But what if the different groups of users are relatively large and dynamic?  What if there are many different applications and database interfaces in use? Implementing a comprehensive and grand set of views can be problematic and time consuming.
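As a simple sketch of the VIEW technique (all object and group names here are hypothetical):

```sql
-- Each sales department sees only its own rows, and the credit_limit
-- column is simply left out of the view.
CREATE VIEW sales_east AS
  SELECT customer_id, customer_name, region
    FROM customers
   WHERE region = 'E';

-- Users are granted access to the view only, never the base table:
GRANT SELECT ON sales_east TO sales_east_group;
```

The trick, as noted above, is making sure every interface in the shop goes through the view and not around it.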

Behold, Another Powerful Tool in the Kit

Through the use of data centric (not application centric) techniques, the row and column data that is returned to the requester can be controlled and governed by a set of policies defined and implemented within DB2 for i. These policies cannot be bypassed or circumvented by the requester, and they are in effect regardless of interface.

The new capability delivered with 7.2 is known as Row and Column Access Control or RCAC.

RCAC provides fine grained access control and is complementary to the ever present object level security (i.e. table privileges). With the new row and column access control feature of DB2 for i, the database engineer, in partnership with the data owner, can ensure that users see only the data that is required for their work, and that result sets match their level of authorization. This can (and should) also include allowing the database engineer to design and implement the policies while restricting him or her from the actual data the policies control. In other words, just because you implemented the database security mechanism, it doesn't mean you have access to all the data.

Some of the advantages of RCAC are:

  • No database user is inherently exempted from the row and column access control policies.
  • Table data is protected regardless of how the table is accessed.
  • No application changes are required to take advantage of this additional layer of data security.
  • Both rows and/or columns can be controlled through simple or complex logic - including the ability to mask what data is projected.
  • Groups of users can share the same policy - or not.
  • The implementation of the policies is part of the DB2 data access layer itself.
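To give a feel for the syntax, here is a hedged sketch of an RCAC definition. The schema, table, column, and group names are all invented for this example, and any real policy must come from the data owner's requirements:

```sql
-- Hypothetical RCAC sketch: rows limited by the user's group profile,
-- plus a column mask over the tax id. Names are illustrative only.
CREATE PERMISSION hr.emp_row_access ON hr.employees
  FOR ROWS WHERE VERIFY_GROUP_FOR_USER(SESSION_USER, 'HRADMIN') = 1
               OR employees.dept = (SELECT dept FROM hr.user_depts
                                     WHERE user_id = SESSION_USER)
  ENFORCED FOR ALL ACCESS
  ENABLE;

CREATE MASK hr.tax_id_mask ON hr.employees
  FOR COLUMN tax_id
  RETURN CASE
           WHEN VERIFY_GROUP_FOR_USER(SESSION_USER, 'HRADMIN') = 1
             THEN tax_id
           ELSE 'XXX-XX-' CONCAT SUBSTR(tax_id, 8, 4)
         END
  ENABLE;

-- Nothing is enforced until access control is activated on the table:
ALTER TABLE hr.employees
  ACTIVATE ROW ACCESS CONTROL
  ACTIVATE COLUMN ACCESS CONTROL;
```

Note how the rules live in DB2 itself: whether the request arrives via 5250, ODBC, JDBC or embedded SQL, the same permission and mask apply.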

Seek to Understand, Then Plan and Test

Like any advanced technique, deep understanding, proper planning, and adequate testing are essential for success. This includes the use of quality assurance, as well as performance and scalability exercises that serve to demonstrate all of your requirements are being met. Proofs of concept and proofs of technology are highly recommended. These projects can be accomplished in Rochester, Minnesota, by the way.

To further assist you with understanding and insight, the DB2 for i Center of Excellence team will be partnering with the ITSO in June and July to author a "redpaper". The document will cover more details on successfully implementing RCAC. Stay tuned for that.

In the meantime, if you are interested in getting more value out of IBM i data centric technology immediately, please reach out, we are here to help.

And finally, check out Kent Milligan's excellent overview of what's new in DB2 for i - including more technical details on row and column access control. The presentation can be found here.

Read the original at DB2 for i.
