
Author Archive

DB2 for i › DB2 for i… OPEN for Business

May 13th, 2014
If you have been following along for a while, I am sure you have come to realize that DB2 for i is one of the most open data centric platforms around.

Whether you are accessing:

  • via embedded SQL (static or dynamic - you choose)
  • via ODBC or JDBC
  • via the call level interface (CLI)
  • from PHP running on IBM i
  • via the tried and true, standard connectivity mechanism known as DRDA

you should be able to get to your data wherever you are, while restricting "others" from doing so.
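
As a small illustration of that openness, a client or job that speaks standard SQL can reach the database with little more than a CONNECT statement once the connectivity is configured. This is only a sketch; the relational database name, user and password below are placeholders, not anything from Kent's article:

    -- Hypothetical example: connect to a DB2 for i database over DRDA
    -- (or from an interactive SQL session) and confirm which server answered.
    -- MYIBMI, APPUSER and the password are placeholders for your environment.
    CONNECT TO MYIBMI USER APPUSER USING 'secret';

    SELECT CURRENT SERVER
      FROM SYSIBM.SYSDUMMY1;

From there, the same tables are reachable whether the request arrives via ODBC, JDBC, CLI or PHP; only the driver changes, not the SQL.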

Mr. Milligan (technical writer extraordinaire) has recently authored an article on accessing DB2 for i from a Linux client. This is a very important topic given the proliferation of Linux in the marketplace. Keeping your data safely within the confines of DB2 while being able to make use of it from IBM i, AIX, Linux or even Windows is the secret to success.

You can check out Kent's article beginning on page 13 in the latest issue of iPro Developer.

And of course, if you need any assistance with getting connected, Kent and the team are here to help you get started.


Read the original at DB2 for i.


DB2 for i › With IBM i 7.2, DB2 Gives You More!

April 28th, 2014
Today (Monday, April 28) marks the announcement of the long-awaited IBM i version 7 release 2. And if you've been following along, you know this also means that the fully integrated DB2 for i moves forward as well.

Now, before illuminating a new powerful database capability residing in the best OS for business, let me remind you that over the past couple of years, the good folks who develop DB2 for i have been enhancing the database capabilities within 7.1 too. These enhancements are known as "technology refreshes" or TRs. The latest iteration, TR8, was announced a few weeks ago - check it out here.

A Data Centric Attitude


For years now, my team has been going around the world educating clients and solution providers on the importance of designing and developing business applications with a data centric attitude. That is to say, employing sound relational database techniques, set-at-a-time principles and information management best practices. After all, this is how to get the most out of DB2 for i, now and in the future.

Architecturally speaking, we want to see applications that exhibit flexibility, extensibility and scalability. And obviously, business users want solutions in a timely and cost effective manner.

Utilizing modern tools and modern methods is the fastest and most efficient way to achieve this goal. Whether creating a new application or re-engineering an existing one, progressive techniques and using SQL are the keys to unlocking the valuable features and functions residing in the database management system woven into IBM i.

You Need Governance and Control 


As I engage clients in various settings, one topic that always seems to pop up is the need for governance and control. Unless you have been keeping your eyes closed and ears plugged, you realize that data is big. More accurately: storing, analyzing and exploiting data in ways that bring new insight and understanding is all the rage.

As the appetite for data increases, so does the importance of governance and control. Determining who is able to see what data is the responsibility of the data owner, and it is a fundamental aspect of designing with integrity and deploying with confidence. It might also keep you out of the spotlight.

In the old days of AS/400, nearly all data access was controlled through the high-level-language 5250 programs that sat at the heart of online transactional systems. To get to a program, and ultimately to the data, the user had to sign on with a valid user profile and corresponding password. If successful, a menu was presented - their menu. The list of items on that menu dictated what the user could and could not do, and what data they did and did not have access to. In many cases, the underlying database objects were not secured and not restricted through object level security. Why should they be? The only way to access the object was through the program, and the only way to call the program was through the menu.

Given the openness of DB2 for i, I trust you now realize that there are many different ways for a user to gain access to data outside of the old style menu based applications. For example, tools and applications that connect via ODBC and JDBC interfaces abound. The need to properly secure database objects directly via the powerful IBM i security features is a must. If I walk into your shop and try to access your physical files with SQuirreL, will I gain access? If you don't know, or are not sure, we need to talk. Seriously.

Unfortunately, granting or revoking rights to the data seems to be an all-or-nothing proposition. If a user has access to the table, they have access to all the rows. If a user does not have access to the table, they have access to no rows. What if different departments all need access to the same table, but each department must be restricted to the subset of rows and columns that it is authorized to see? Is there a way to allow each group of users access to only their respective data sets?

Vistas Grande


Within the science and art of relational database there exists the ability to provide a logical "view" of the data. A VIEW is a virtual table, and as such, the database engineer can define and publish various views of the same data. This technique can be used to provide different groups of users with different sets of rows that they can "see". The trick is to ensure that the user is only accessing their particular VIEW and no other.

For a relatively small and stable set of different users, creating and utilizing VIEWs is an elegant and acceptable way to control access to data, whether it be a particular set of rows or a subset of columns.
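
As a sketch of the technique (the schema, table, column and group profile names here are invented for illustration), a view can carve out just the rows and columns a department is entitled to see, with privileges granted on the view rather than on the underlying table:

    -- Hypothetical example: expose only the sales department's rows,
    -- and only the non-sensitive columns, through a view.
    CREATE VIEW corpdata.sales_employee AS
      SELECT empno, lastname, workdept, hiredate
        FROM corpdata.employee
       WHERE workdept = 'SLS';

    -- Users are authorized to the view, not to the base table.
    GRANT SELECT ON corpdata.sales_employee TO salesgrp;
    REVOKE ALL ON corpdata.employee FROM PUBLIC;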

The VIEW is also very useful when needing to transform data and/or hide complexity, such as join syntax or aggregation specifications. But what if the set of different user groups is relatively large and dynamic? What if there are many different applications and database interfaces in use? Implementing a comprehensive and grand set of views can be problematic and time-consuming.

Behold, Another Powerful Tool in the Kit


Through the use of data centric (not application centric) techniques, the row and column data that is returned to the requester can be controlled and governed by a set of policies defined and implemented within DB2 for i. These policies cannot be bypassed or circumvented by the requester, and they are in effect regardless of interface.

The new capability delivered with 7.2 is known as Row and Column Access Control or RCAC.

RCAC provides fine-grained access control and is complementary to the ever present object level security (i.e. table privileges). With the new row and column access control feature of DB2 for i, the database engineer, in partnership with the data owner, can ensure that users see only the data that is required for their work, and that result sets match their level of authorization. This can (and should) also include allowing the database engineer to design and implement the policies, while restricting him or her from the actual data the policies control. In other words, just because you implemented the database security mechanism doesn't mean you have access to all the data.
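
To make the mechanics concrete, here is a minimal sketch of what the 7.2 RCAC syntax looks like. The schema, table, column and group profile names are invented for illustration, and the rules are deliberately simplified; see the 7.2 documentation for the full syntax and restrictions:

    -- Hypothetical example: a row permission and a column mask on an
    -- employee table. Members of the HRGRP group profile see every row;
    -- sales users see only their own department's rows; the tax ID is
    -- masked for everyone outside of HR.
    CREATE PERMISSION hr.employee_row_access ON hr.employee
      FOR ROWS WHERE QSYS2.VERIFY_GROUP_FOR_USER(SESSION_USER, 'HRGRP') = 1
                  OR (QSYS2.VERIFY_GROUP_FOR_USER(SESSION_USER, 'SALESGRP') = 1
                      AND workdept = 'SLS')
      ENFORCED FOR ALL ACCESS
      ENABLE;

    CREATE MASK hr.employee_taxid_mask ON hr.employee
      FOR COLUMN tax_id RETURN
        CASE WHEN QSYS2.VERIFY_GROUP_FOR_USER(SESSION_USER, 'HRGRP') = 1
             THEN tax_id
             ELSE 'XXX-XX-' CONCAT SUBSTR(tax_id, 8, 4)
        END
      ENABLE;

    -- Nothing is enforced until the controls are activated on the table.
    ALTER TABLE hr.employee
      ACTIVATE ROW ACCESS CONTROL
      ACTIVATE COLUMN ACCESS CONTROL;

Because the policies live with the table inside DB2, they apply to a 5250 RPG program, an ODBC query tool and a web service alike.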

Some of the advantages of RCAC are:

  • No database user is inherently exempted from the row and column access control policies.
  • Table data is protected regardless of how the table is accessed.
  • No application changes are required to take advantage of this additional layer of data security.
  • Rows, columns, or both can be controlled through simple or complex logic - including the ability to mask what data is projected.
  • Groups of users can share the same policy - or not.
  • The implementation of the policies is part of the DB2 data access layer itself.

Seek to Understand, Then Plan and Test


Like any advanced technique, deep understanding, proper planning, and adequate testing are essential for success. This includes the use of quality assurance, as well as performance and scalability exercises that serve to demonstrate that all of your requirements are being met. Proofs of concept and proofs of technology are highly recommended. These projects can be accomplished in Rochester, Minnesota, by the way.

To further assist you with understanding and insight, the DB2 for i Center of Excellence team will be partnering with the ITSO in June and July to author a "redpaper". The document will cover more details on successfully implementing RCAC. Stay tuned for that.

In the meantime, if you are interested in getting more value out of IBM i data centric technology immediately, please reach out; we are here to help.

And finally, check out Kent Milligan's excellent overview of what's new in DB2 for i - including more technical details on row and column access control. The presentation can be found here.


Read the original at DB2 for i.


DB2 for i › DB2 for i Technical Forums

March 7th, 2014
Hola! My teammate and amigo Hernando Bedoya (he is actually familia) and I are just wrapping up two very successful DB2 for i Technical Forums in Bogota, Colombia (week of Feb 24) and Bilbao, Spain (week of Mar 3). Hernando was an excellent guide and translator. (You see, my Spanish is virtually nonexistent, and based in part on watching Speedy Gonzales cartoons on TV as a child.)

First and foremost I want to extend my sincere thanks and appreciation to our event partners...

In Bogota, Mr. Octavio Bustos of Redsis arranged and hosted the Forum for his clients in Colombia.

In Bilbao, Mr. Igor Izaguirre of Trentinort arranged and hosted the Forum for his clients in Spain.

The opportunity, and our success, are due in large measure to these good folks. Muchas gracias!

___


The history of the DB2 for i Technical Forum began eight years ago on a long and scenic drive from the small village of Agordo in the Italian Alps to the port city of Venice.

You see, the famous and worldwide ITSO AS/400 Technical Forums originally conceived by the one and only Ian Jarman were no more, and the IBM conduits to share accurate and current technical guidance about all things OS/400 were shriveling up.

During the drive out of the mountains, my long time IBM colleague and dear friend Simona Pacchiarini was wondering aloud how we could continue to spread the word about database. In a moment of inspiration (or maybe lunacy) we hit upon the idea of creating our own "Technical Forum" - expressly for the purpose of sharing and discussing database topics. Given that some of the philosophical and architectural concepts of "forum" can be traced back to Italy, we felt compelled to initiate this idea in Milan, and we have continued the tradition on an annual basis ever since.
(And what is the real reason my DB2 for i Technical Forums are in Italy every year? Simona is a bulldog when it comes to getting good things done!)

___


I have used the Forums in Italy as a proving ground when developing techniques that serve to illuminate the DB2 for i features, functions and unique advantages. During last year's event I tested a new theme entitled "Design It, Build It, Tune It". Of course, this topic is in addition to the standard update on what's new in DB2. Given the success of the presentations and lab exercises, we are continuing the theme in 2014.

The simple notion behind the sessions on design / build / tune involves the importance of sharing knowledge, teaching skills and discussing best practices for data centric applications, in a holistic way.

"Design It" focuses on the science and art of data modeling - something that is universal, but sorely lacking in IBM i shops. We present an overview of conceptual models, logical models and physical models which serve as the blueprint for communicating the database architecture and design.

"Build It" focuses on implementing a physical model, thinking in sets (instead of records), and the elements of set-at-a-time processing. The SQL components of DDL, DML and PSM are used to demonstrate the concepts and used to construct the data centric application (i.e. tables, views, indexes, queries, procedures).

"Tune It" focuses on techniques for increasing efficiency, performance and scalability. We present an overview and discuss the IBM i integrated toolset for monitoring, analyzing and tuning SQL requests.

___


I believe the biggest value of the DB2 Forum comes from the final presentation, in which all of the features and functions are woven into a realistic business problem/solution scenario. The practical demonstration, putting all of the concepts and best practices into action, really shows the audience a clear path forward in terms of data centric development technique.

In the spirit of a forum, our three-day event always includes lively dialogs, discussions and debates on all things database.

If you are interested in bringing the DB2 for i Technical Forum to your clients or organization, please reach out. Or if you are in need of specialized / focused assistance with your data centric project, we are here to help.

Adios



Read the original at DB2 for i.


DB2 for i › The ABC of Social

February 21st, 2014
Analytics
Big Data
Cloud

Social

Are you wondering why constant and measured Social interaction via the web (and soon simple text messages) is such a big deal to business?

Hint: it's not necessarily about sharing your latest vacation picture with friends and family.

For a simple answer (ok, maybe not so simple) I encourage you to check out a behind the scenes look at how Analytics, Big Data and Cloud are used to drive and exploit Social behavior; watch the recent PBS FRONTLINE documentary "Generation Like".

As a fifty-something who grew up with one rotary phone in the house, I found it amazing.
As a data centric professional, I found it interesting.
As a business person, I found it profound.
As a parent, I found it disturbing.

I pose this rhetorical question: what do you think of the ABC S?

Of course, multifaceted and "all of the above" answers are permitted.


Read the original at DB2 for i.


DB2 for i › You’re a DBE, now what?

January 27th, 2014
In between client engagements this past week, I had the opportunity to sit down with a long-time member of my Center of Excellence team and all-around super consultant Jim Denton. We talked about the progress made in getting more and more of our IBM i users to embrace the idea of a DB2 database engineer. The notion of having a person (or persons) in the building who really gets, and truly focuses on, "DB2 for i" is taking hold, and it's paying dividends for those companies that are forward thinking.

Jim was relaying an interesting question he received from one of our favorite clients over in Dayton, Ohio.  This particular company has fully embraced the wholesale and winning combination of acquiring data centric knowledge and skills, as well as embarking on a targeted and well managed database modernization and re-engineering project.  Good for them!

The question posed goes something like this:

"Hey Jim, now that I'm a database engineer, what should I be doing day to day?"

First, this is a really insightful question. It means that the person is thinking strategically and is ahead of the curve.  Well done!


The What and Why


What you should be doing day-to-day as a DB2 for i engineer basically falls into two categories:

  1. Reactive
  2. Proactive

Both reactive and proactive activities generally involve: monitoring, analysis and something else.

The something else will depend on your focus and your client (i.e. who you are providing value to).

Why you should be doing this seems self explanatory, but if it's not, consider this:

Partnering with leaders on solving data centric business problems and getting more value out of data is of the utmost importance. Doing it proactively makes YOU more valuable and your company or organization more viable.

In terms of monitoring, assume you are an investigator.  You need to peek behind the curtain, look inside the black box, ask the probing questions. In short, you need to learn more about your data, the usage of data and the life cycle of data. You need to discover who is using the data and for what purpose.

Here are a few ideas to get you started. If you need more, please reach out and we can help you identify and prioritize activities that provide a good return on investment. We can also help you get better at identifying and partnering with your colleagues.

The Ideas


Narrow your scope to the top 10. Once you have a repeatable process, open the aperture and expand.

Profile your data. Learn more about the properties and trends of tables and indexes. How large are they? How fast are they growing? Who is using them? How are they used? Who has access to the data? How is the data accessed? When is the data accessed? Get the idea? Know your data.
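
For example, a first-pass size and usage profile can come straight out of the catalog. This is a sketch only; MYSCHEMA is a placeholder, and you should verify the QSYS2 catalog view and column names against your release:

    -- Hypothetical sketch: the ten largest tables in one schema, by row count.
    SELECT table_schema, table_name, number_rows, data_size
      FROM qsys2.systablestat
     WHERE table_schema = 'MYSCHEMA'
     ORDER BY number_rows DESC
     FETCH FIRST 10 ROWS ONLY;

Run it on a schedule, keep the results in a table of your own, and the growth trend falls out of a simple comparison over time.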

Take periodic and consistent snapshots of the query plans. Store this data and use it to develop trends. What are the longest-running queries? Who is running the queries? What are the attributes of the query requests (do you see simple statements, or complex statements)? Who is reading the three million results from the query run twice a day, every day? (I can tell you... no one - this is an extract!)
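
One hedged way to script such a snapshot (verify the procedure against your release; the SQL Performance Monitor in IBM i Navigator is the graphical route to the same data) is to dump the plan cache to a file you control:

    -- Hypothetical sketch: capture the SQL plan cache into a snapshot file
    -- for later analysis and trending. Library and file names are placeholders.
    CALL QSYS2.DUMP_PLAN_CACHE('SNAPLIB', 'PC140127');

Schedule it daily, and the questions above become queries against your own snapshot data.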

The DB2 for i catalog is your friend. Get to know it. Use it.
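
A couple of starter queries, sketched here with a placeholder schema name, show the kinds of questions the catalog can answer directly:

    -- Hypothetical sketch: which tables live in a schema, and how wide are they?
    SELECT t.table_name, t.table_type, COUNT(c.column_name) AS column_count
      FROM qsys2.systables t
      JOIN qsys2.syscolumns c
        ON c.table_schema = t.table_schema
       AND c.table_name   = t.table_name
     WHERE t.table_schema = 'MYSCHEMA'
     GROUP BY t.table_name, t.table_type
     ORDER BY t.table_name;

    -- Hypothetical sketch: tables that still have no primary key constraint.
    SELECT t.table_name
      FROM qsys2.systables t
     WHERE t.table_schema = 'MYSCHEMA'
       AND t.table_type IN ('T', 'P')
       AND NOT EXISTS (SELECT 1
                         FROM qsys2.syscst k
                        WHERE k.table_schema    = t.table_schema
                          AND k.table_name      = t.table_name
                          AND k.constraint_type = 'PRIMARY KEY');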

Make a blueprint of your current data model. Pick a subject area or application and model the database objects, such as they are. Use reverse engineering tools and methods to be more productive. While you're at it, take this opportunity to learn a proper database modeling interface. Something that has more colors than just green.

Use the blueprint to identify gaps in your organization's relational database best practices. SQL queries are based on and driven by the data model. A bad data model means more work for the SQL query. A good data model means less work for the SQL query. If the model behaves well (i.e. well defined sets), then the SQL will behave well (i.e. good set operations).

Using the current data model and your gap analysis, define a future data model and build a new blueprint. Formulate a plan to modernize and re-engineer the database to incorporate the foundational elements that support data centric processing.

Watch and wait for an opportunity (i.e. business requirements) to implement the modernization and re-engineering plan. We like to do this on a targeted and cost-justified basis. Better yet, don't wait; engage the business to understand pain points and new initiatives.

Get out of your office or cubicle. Go meet people. Talk to your colleagues and clients at every level. Find out what they are struggling with. Find out what they are using the data for. Find out what the data is NOT providing them.

The information seekers in your organization are requesting data from IT. The data is extracted and downloaded to a PC. Once on the desktop, the real magic occurs. Peek behind the curtain, look inside the black box, ask the probing questions about what the data is used for and why. Better yet, ask the information seeker, "What problem are you trying to solve?" This is how you get in a position to provide value on a proactive basis.

This is how you become relevant and stay relevant.

What else?

Well... what about you?

I recommend you strive to learn something new every week. Take an hour (or two) each and every week for yourself. Crack open the SQL Reference manual. Initiate a project for yourself. Do a proof of concept or proof of technology. Develop and hone your soft skills.

All work and no play makes Jack a dull boy.


Read the original at DB2 for i.


DB2 for i › RESOLUTION

January 16th, 2014

As we pass AND approach the various new year thresholds, this is a time when most of us symbolically reflect on the past, and plan for the future. This has been practiced for centuries in cultures all around the world.

While crossing the seasonal threshold and moving from one year to the next, there is a long standing tradition of making (and breaking) a resolution. Many of these annual resolutions involve a promise to do less, do more, or do better.

According to Merriam-Webster the definition reads...

res·o·lu·tion    noun \ˌre-zə-ˈlü-shən\

: the act of finding an answer or solution to a conflict, problem, etc.

: the act of resolving something

: an answer or solution to something

: the ability of a device to show an image clearly and with a lot of detail


I really like all of these explanations, and believe they apply very well to what we need to do as IT professionals (your personal resolutions are your own business).

When I reflect back on what my team has witnessed in countless IT organizations around the world, a few clear and distinct things come to mind.

Every business leader wants to have (IT) solutions that are:

  • Flexible
  • Extensible
  • Agile
  • Scalable
  • Timely

To meet these requirements, new tools, new techniques and new approaches need to be embraced. And if you ask me, doing so without throwing the baby out with the bath water. In other words, keeping what works and re-engineering what doesn't. Rarely do we find that it is cost effective or advantageous to start over from scratch. Some might call this approach "evolution, not revolution". The pace at which you evolve is a function of time, energy and funding. I would also throw in will power.

From a data perspective, business leaders want more relevant information. They want to move from data being the static and benign byproduct of a transaction or event, to data being the raw material that is refined into insightful information; something that provides a unique and otherwise hidden perspective on what's happening, what's coming, what's needed.

For many (most?) of my clients, it seems that data is dragging them down like some kind of massive anchor. A huge bucket of bits that is increasing in weight and volume, forever tugging on the organization's precious resources.

For a few keen companies, data is an enabler. This byproduct of their transactions or events will serve an important purpose: to become a lens that brings the past into focus and looks into the future. Their data becomes an asset; something to be preserved, treasured and made useful. Something that becomes a unique advantage to their business, their partners and their customers.

Approaching Information Management Differently


Another major trend we see is that increasingly, the formal IT organization is being bypassed. Or at best, relegated to being just the conduit for data. This means that the effectiveness and more importantly, the value of traditional IT organizations is being eroded.

We see more and more business users acquiring information technology solutions directly - deploying and using them without the assistance or oversight of the IT organization. Obviously there are many concerns with this trend. The one I want you to focus on is: irrelevance.  As in, your knowledge, skill and expertise are irrelevant, no longer required.

Increasingly, users have access to the data. The various lines of business have growing requirements for information. If IT cannot provide the information, then they are asked only to provide the data. Sooner rather than later, the "value add" of IT will dry up. To be sure, IT should have a vital part to play in the acquisition, storage, management, and productive / safe use of data. But given the current trend, how do you stay relevant? How do you continue to provide value?

One idea I have been sharing with my clients involves a change in philosophy and approach. In terms of engaging your business leaders, users and colleagues, move towards a consulting-oriented discussion instead of a service-oriented discussion. As a consultant, you are in a position to guide and influence the community around you. And don't wait for them to come to you. Be proactive. Go to them. This is what makes you valuable and keeps you on the list of critical success factors.

Find a way to help, not hinder.

_________


For the coming year, let me suggest that you persevere to:

  • Stay relevant
  • Make and maintain a connection with your users and clients
  • Provide real value

To accomplish these things (from an information management perspective), I recommend you personally resolve to:

  1. Learn more about the science and art of information management
  2. Gain control and governance of data and data access (don't be a Target)
  3. Get better at the design, architecture and modeling of database solutions

Happy New Year!



Read the original at DB2 for i.


DB2 for i › DB2 for i SQL Classes for 2014

December 11th, 2013
The DB2 for i Center of Excellence team has scheduled some high powered classes in Rochester, Minnesota (home of IBM i and DB2 for i) for 2014.  Make your plans now to take advantage of THE experts and in some cases, the folks who wrote the code.

For the data centric application architect and SQL programmer we have the DB2 for i Advanced SQL Workshop. This event is scheduled for February and September.

To view the description, dates and to enroll go here, and be sure to check out the video overview at one of these sites:

English on YouTube

Spanish on YouTube

English on YouKu  (I have no control over the commercials!)

For the DB2 for i Engineer, or anyone interested in becoming more adept at understanding and solving SQL performance and scalability issues we have the DB2 for i SQL Performance Workshop. This event is scheduled for March and September.

To view the description and to enroll go here, and check out the video overview at one of these locations:

English on YouTube

Spanish on YouTube

English on YouKu  (again, I have no control over the commercials!)

If you have any questions about the value of attending these workshops, please reach out to me.

If you are interested in bringing these knowledge and skills transfer workshops directly to your organization on a private basis, we can do that too.


Read the original at DB2 for i.


DB2 for i › Big Data Analytics: an Example

November 1st, 2013
During virtually every presentation I've made this year I've been asked about "big data"...

What is it?
Why is it?
Can IBM i do it?

My brief answer usually takes the form of both a description, followed by a question back.

The subjective and objective description of Big Data and the notion of analytics have been covered here and in many other places, so I'll spare you the repetition. Do be aware that Big Data is so hot right now, everyone is jumping on the bandwagon - using the phrase as a way to associate their products and services with the latest trend. Combining "Big Data" and "Cloud" results in a hurricane of marketing hype that is vast and powerful.

In my quest to provide clarity and understanding, the question I pose back to the audience goes something like this:

"Do YOU have big data, and do YOU have a requirement to analyze it"?

Most folks stare back at me, wondering whether they do or not. To help them really understand my question, I offer a simple (and favorite) example of Big Data Analytics:

"Imagine the capture and analysis of every tweet flowing through Twitter on a daily basis".

To be sure, most folks in my audience are not taking full advantage of the data they have in house now, much less able and willing to make use of the truly massive amounts of data flowing past their internet doorstep.

A Real Live Example


Lo and behold, while traveling this week I read an article that crisply illustrates my default example of Big Data Analytics. The article highlights a new startup, Social Market Analytics, that looks at all the tweets pulsing through the worldwide network.

They analyze the messages to develop a profile based on the Twitter "chatter" about a given company traded publicly on the stock market. The resulting profile and corresponding scoring of the dialog will (hopefully) provide investors with a crystal ball of sorts. At least that's what they are telling potential users of the service.

If you want to read the article for yourself (and I recommend you take time to do so), it appears in the October 15, 2013 edition of American Way Magazine.

And if you are still wondering what it takes to handle big data, imagine trolling through 400 million tweets per day, looking for the few pearl-laden oysters, in near real time!


Read the original at DB2 for i.


DB2 for i › Are You Ready to Advance Your Knowledge

October 14th, 2013
In previous posts we've discussed how to become a database engineer, and we've talked about how to advance your knowledge and skills around SQL.  We've also chatted many times about the importance of learning the science and art of data centric design and programming.

To that end, I'm happy to announce a couple classes being offered by the DB2 for i Center of Excellence in Rochester, Minnesota (home of IBM i and DB2 for i).

For the application architect and SQL programmer we have the DB2 for i Advanced SQL Workshop. This event is scheduled for 12-14 November 2013.

To view the description and to enroll go here.

For the DB2 for i Engineer, or anyone interested in becoming more adept at understanding and solving SQL performance and scalability issues, we have the DB2 for i SQL Performance Workshop. This event is scheduled for 3-6 December 2013.

To view the description and to enroll go here.

If you have any questions about the value of attending these public workshops, please reach out to me.

If you are interested in bringing these workshops directly to your organization on a private basis, we can do that too.


Read the original at DB2 for i.
