Archive for the ‘Blogs’ Category

DB2 for i › Introducing DB2 Web Query DataMigrator!

February 27th, 2015
IBM has announced a new product offering in the DB2 Web Query family portfolio.

DB2 Web Query DataMigrator ETL Extension (DataMigrator) provides extract, transform, and load (ETL) capabilities integrated with DB2 Web Query and DB2 for i.

Users who want to isolate their query workloads from production environments, or who need to automate the consolidation of data from a variety of sources, will be interested in this solution.

DataMigrator lets you automate extracting data from any DB2 Web Query supported source database, transforming or cleansing that data, and loading it into an optimized DB2 for i reporting repository. Depending on the design and use of that repository, the industry refers to it as an operational data store, a data mart, or a data warehouse.

DataMigrator shares the graphical look and feel of DB2 Web Query, and the two products also integrate; for instance, metadata defined in either DataMigrator or DB2 Web Query can be synchronized between them.

For more information, refer to the following documents:

IBM US Announcement

IBM Europe, Middle East, and Africa Announcement

Frequently Asked Questions document

If you need assistance with the architecture, design, or implementation of your analytics environment running on IBM i, please reach out - we are here to make you successful!


Read the original at DB2 for i.


Alan Seiden Consulting: PHP and IBM i Expertise › Video promo for WMCPA IBM i conference: March 10-12, 2015

February 22nd, 2015

Join me and 20 other speakers at the Wisconsin Midrange Computer Professional Association (WMCPA) spring technical conference, March 10-12, 2015, at the Lake Lawn Resort on the shores of Delavan Lake.

Speakers: Aaron Bartell, Rob Bestgen, Larry Bolhuis, Tom Cremieux, Floyd Del Muro, Raymond Everhart, Margaret Fenlon, Susan Gantner, Charles Guarino, Scott Klement, Chris Koppe, Jon Paris, Mike Pavlak, Jim Ritchhart, Debbie Saugen, Alan Seiden, Dr. Frank Soltis, Robert Swanson, Robin Tatam, Jeff Tickner, Steve Will

Details: http://wmcpa.org/index.php/conference-2015/ibmi-conference-2015


Read the original at Alan Seiden Consulting: PHP and IBM i Expertise.

Pete's Wordshop › The great open source mashup on IBM i

February 18th, 2015

To the outside world, and perhaps to 80% of the established IBM i community, running open source applications on the IBM i operating system is irrelevant.  This is a crime of monumental proportions: folks are standing up *NIX and Windows servers while they let their IBM i systems languish with “legacy” applications.  How sad for them. They are missing out on the benefits of leveraging those legacy apps with state-of-the-art open source frameworks that could make their shops more productive, scalable, and secure.

Node is getting the latest buzz on the web and in the small but active open source community on IBM i.  I gotta admit that getting that first node app up was my goal (be first!) when it was announced, but my younger alter ego, Aaron Bartell, beat me to it (as usual).  But getting node.js stood up as a server is only the first step.  Getting a “hello world” app working is perhaps the second, but the real fun is taking a few existing technologies and putting together a mashup that leverages old with new.  You IBM i folks have a boatload of RPG apps; how the heck would you hook them up with node.js?

socket.io and express.js are TCP/HTTP friendly, which makes them a good place to start!  The IBM i is a rock solid HTTP server platform, and there are plenty of open source technologies that leverage HTTP on IBM i, so HTTP is a logical meeting point.  How could we link things up so that node.js serves up apps that can communicate with and use other HTTP technologies?  Well, I threw together an RPG app that uses an HTTP POST to post data to the chat application example.  Check out the quick and dirty example I put together on my Github repo here. It isn’t as robust as I want it to be, but I am just getting started (and wanted to do more than just ‘Hello World’!)
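To make the idea concrete, here is a rough sketch of the kind of HTTP POST involved. The route, port, and payload are hypothetical stand-ins for the real code in the Github repo; an RPG program can issue the identical POST through an HTTP client library.

```shell
# POST a JSON chat message to a (hypothetical) /message route of a
# node/express chat app listening on port 3000. An RPG program can
# issue the same POST through an HTTP client library.
curl -s -X POST "http://localhost:3000/message" \
     -H "Content-Type: application/json" \
     -d '{"user":"RPGPGM","text":"Hello from the i!"}' \
  || echo "no server listening on :3000 (start the chat app first)"
```

The same curl command is a handy way to test the node side of the mashup before wiring up the RPG client.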

Just think of the possibilities!  HTTP is the perfect way to mash up all things IBM i.  More to come!


Read the original at Pete's Wordshop.

RPG and Programming › Over my head

February 15th, 2015

It has been a long time since my last post. My blog has obviously not been the biggest blip on my radar, but things keep happening, and I say, “I ought to write about that.” So here goes:

The biggest thing was my attempt to write a system to handle financial transactions using sockets over the Internet. RPG was not the language finally chosen; you can do sockets programming in RPG, but much of the problem involved continuing to handle processing while we did nightly batch processing and system saves.

The language I finally chose was Object Pascal. Not a bad choice, but it finally became apparent that I was much too used to doing file processing in RPG; sometimes you have to jump through hoops in other languages for things you take for granted in RPG.

The project itself contains extremely intricate data processing. After a while (over a year) of hammering at it, and having a few minor successes and many major roadblocks, I finally threw in the towel. I suggested using ASNA’s Visual RPG, but it seems that the process is being passed off to another programmer more familiar with network programming. I continue to believe that the process could be done more cleanly by buying a package, but for some reason there is an aversion at our company to that - they prefer home-grown. At any rate, I am glad it is out of my hands. I never felt more inadequate as a programmer than when I was trying to make this process work. I was willing to keep on trying, but I would not have been able to make any promise as to when it would be done.

As always, I promise to try to write more. Hopefully, this will break the ice.


Read the original at RPG and Programming.


Alan Seiden Consulting: PHP and IBM i Expertise › Overcome 8-character user profile limit in SSH for IBM i

February 6th, 2015

By default, user profiles longer than eight (8) characters cannot connect to IBM i via Secure Shell (SSH). This limitation affects tools such as Zend Studio, which will show a message similar to “Failed to connect sshd on <some ip address>.” Beginning with IBM i release 6.1, however, IBM provided a means of removing this limitation, so that user profiles longer than 8 characters could connect. Releases 6.1 and 7.1 require a PTF and a configuration change; release 7.2 and higher require only the configuration change.
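For reference, the configuration change in question, as I understand it from IBM's PASE documentation (verify against Rod's article below before relying on it), is a system-level environment variable plus an sshd restart:

```
ADDENVVAR ENVVAR(PASE_USRGRP_LIMITED) VALUE(N) LEVEL(*SYS)
ENDTCPSVR SERVER(*SSHD)
STRTCPSVR SERVER(*SSHD)
```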

For details, see Zend Support specialist Rod Flohr’s article about how to remove the 8-character limit on user profiles when connecting via SSH.


Read the original at Alan Seiden Consulting: PHP and IBM i Expertise.


Pete's Wordshop › Stumbling startup with node.js on IBM i

February 5th, 2015

My 7.2 upgrade was done specifically to get going with an “officially” supported version of node.js.  Installing 5733OPS is a no-brainer (either RSTLICPGM or the option to install a licensed program on the LICPGM menu will work).  Node worked great; I'm already creating much mayhem with socket.io, express.js, and a bunch of other stuff I will eventually break.  The problem I had is that I could only do this stuff when I was in the Node/bin folder.  Otherwise I would see:

node -v
exec(): 0509-036 Cannot load program node because of the following errors:
0509-150   Dependent module libstdc++.a(libstdc++.so.6) could not be loaded.
0509-022 Cannot load module libstdc++.a(libstdc++.so.6).
0509-026 System error: A file or directory in the path name does not exist.

What?   I am only a noob when it comes to *NIX environments, and I know there are a bunch of moving parts that need to be correctly aligned.  I thought my PATH was the only thing I needed to look at, but I was missing something essential: the LIBPATH environment variable.  Many executables share functionality, so the “helper” objects (like service programs or DLLs) need to be findable too; that is what LIBPATH controls.  My “Brogrammer” in the open source space, Aaron Bartell, turned me on to that fact, and he proposed a couple of solutions.  You could just manually set the PATH and LIBPATH:

Depending on the shell you use, you can do it in one step like this:

export PATH=/QOpenSys/QIBM/ProdData/Node/bin:$PATH
export LIBPATH=/QOpenSys/QIBM/ProdData/Node/bin:$LIBPATH

or your shell may require two steps, like this:

PATH=/QOpenSys/QIBM/ProdData/Node/bin:$PATH
export PATH
LIBPATH=/QOpenSys/QIBM/ProdData/Node/bin:$LIBPATH
export LIBPATH

Or you could create a script and just run it when needed.  Here are some screen shots that show the problem and solutions:

[Screenshot: node_error]

So the problem occurs in #1 above even though I have the node/bin folder on my path (#2).  If I check the LIBPATH environment variable, it is empty, so I append the path to the shared library (which just happens to be the same as the node/bin folder) to LIBPATH.  Adding the path and exporting it makes it available to my environment, so now when I run node -v I get the version listed rather than an error.  Nice!  Aaron pointed me here to find this info and recommended that I post questions here so that others might benefit.  Agreed!  But I’ll probably also post here, since I tend to forget where I posted solutions to my issues.

The script to do this is simple as well.  You should have a home folder on your IBM i under your user name; mine would be /home/PETE/.  You could either use this command on IBM i:

EDTF 'node_env.sh'  (you can call the script whatever you want):

[Screenshot: edtf_node_env]

If you want to up your *NIX geek cred, you could use vi in SSH (or call qp2term):

vi /home/PETE/node_env.sh

[Screenshot: vi_node_env]
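For reference, the body of the script is just the same exports shown earlier (the path assumes the default 5733OPS install location):

```shell
#!/bin/sh
# node_env.sh - put the Node.js binaries and shared libraries
# on PATH and LIBPATH for the current shell session
NODE_BIN=/QOpenSys/QIBM/ProdData/Node/bin
PATH=$NODE_BIN:$PATH
LIBPATH=$NODE_BIN:$LIBPATH
export PATH LIBPATH
```

One *NIX gotcha: run it as `. /home/PETE/node_env.sh` (note the leading dot) so the variables are set in your current shell rather than in a subshell that throws them away when the script ends.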

However you get there!  Then when you are in the console you can just execute the script:

[Screenshot: node_script_fix]

So, all is well.  Just remember to execute the node_env.sh script.  Or, if you are going to launch a node server instance from CL, I create a script and then just execute that script from the CL program, so the CL doesn’t need multiple entries to set the environment.  Do your heavy lifting in the script and let your CL do the easy part.
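If you go the CL route, a minimal wrapper (the script name and path are hypothetical examples) can hand the whole job to PASE via QP2SHELL:

```
PGM
  /* Run the PASE script that sets PATH/LIBPATH and starts node.  */
  /* '/home/PETE/start_node.sh' is an example path - use yours.   */
  CALL PGM(QP2SHELL) PARM('/home/PETE/start_node.sh')
ENDPGM
```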

So, that’s it!  There is probably more detail here than most folks need, but for those who are new to open source on IBM i, a little extra instruction can fill in a lot of blanks.

 


Read the original at Pete's Wordshop.

Bob Cancilla on IBM i › Cloud Computing — The Future of Computing Starts Today!

January 31st, 2015

Cloud Computing

The future of computers and technology

Cloud computing is a reality in 2015 and is growing rapidly in use and popularity from the largest to the smallest of enterprises around the globe.  Cloud provides a robust stable computing environment at a fraction of the cost of owning and operating your own computer equipment.  It is highly scalable, secure, and usually includes built-in high availability and disaster recovery. 

Cloud computing can not only reduce the cost of computing equipment and the infrastructure necessary to support that equipment, but can also provide a dramatic reduction in labor costs. 

Software is loaded into the cloud and accessed remotely via the Internet.  You may choose from a vast array of commercial and open source software products or develop and deploy your own. 

A key advantage of cloud computing is the fact that the cloud vendor maintains the environment, the network, and all of the middleware required to run the equipment.  Depending on the size of your existing computing environment, the labor savings in this area alone can be substantial.

Another set of costs you save are those associated with disaster recovery.  You do not need a DR site or equipment, nor DR testing and all the cost that goes into ensuring that you can survive an unforeseen disaster.

A reputable cloud provider will replicate both your applications and data across several servers in several locations.  This replication means that if a location were to become unavailable due to a disaster of some sort, your systems will continue to run at the other locations.  Top-notch cloud vendors offer multiple sites distributed around the globe, with full system replication as part of their base offering.

Growth and scalability are yet another benefit of cloud computing.  Most reputable cloud vendors let you or your staff increase the resources you use online via an easy-to-use web interface, or you may be configured to use whatever resources you require and be billed for usage, much like a utility.  This means that your cost may escalate during peak periods of the year but will go back down when the workload drops to lower levels.  Cloud usage scaling is much less costly than attempting to handle this problem in-house with your own equipment.

Many claim that security is a big risk of cloud computing.  I suggest that security is a huge advantage.  A reputable cloud vendor will employ the very best security people that can be hired, along with the best security monitoring software, and will engage ethical hackers to help protect the network and your applications.  While technically the security of your applications, even in a cloud environment, is your responsibility, a reputable cloud provider will offer security services, built into your fees or at a moderate add-on cost, that provide much better security than you could afford on your own.

Throughout this paper I have used the words “reputable” cloud vendor.  There are many articles on the Internet that speak to the risks of cloud computing.  They talk about the risks of allowing a 3rd party to manage your data, disaster recovery, security, 3rd party employees with potential access to your data, etc.

What is the solution?  Choose your cloud provider very carefully.  There are many companies in the marketplace claiming to be “cloud providers”.  I came across one that specialized in AS/400 (IBM i) hardware and software hosting; they had one data center in a single city.  How about another “cloud vendor” that meets most of the requirements I set forth but operates three data centers in the same city!

Addressing the issue of cloud vendor employees with access to your data: you will find that most employees at that level are bonded and insured.  Are your own employees bonded and insured, protecting you against fraud, theft, or other malicious acts?

Cloud computing reduces the cost of computer equipment ownership and support, often generating savings in the millions of dollars.

How to get started?

Engage a reputable consulting firm that can take you from your existing environment to the cloud.  You will want to formulate a migration and modernization strategy.  Often that may include replacing existing software with vendor-provided or open source software.  For other software, you may wish to engage off-site contract developers to support you, enhance your software, or perform development on your behalf in lieu of the overhead of an in-house IT development staff.

We have found that most IT organizations waste a huge percentage of their annual budget, that many of the people you employ have dated skills that are not viable in today’s ever-changing environment, and that there are too many technologies for a small group of people to master and support.  Why not outsource your hardware to a reputable cloud vendor and implement a business-centric software management plan using outsourced resources?

A mid-to-large-size IT organization can save millions of dollars; even very small businesses with a limited number of employees (3 to 5) can obtain substantial savings.

Finally, in addition to cost savings, you will find that you can respond to the needs of your business much more rapidly than you can with in-house systems and resources. You will also find that you are able to better leverage modern and emerging technologies, all helping to improve your bottom line.




Read the original at Bob Cancilla on IBM i.


Pete's Wordshop › Moving to IBM i 7.2

January 29th, 2015

Been meaning to do this for at least 6 months and finally bit the bullet and MOVED!  Naturally, the website has been down for days as I worked through several wrinkles.  On the IBM i OS side, I failed to pull my keys ahead of time.  Actually, I DID download and install the keys; I also downloaded the updated OS, the licensed programs, and the latest CUME.  BUT the one thing I failed to do was choose to “upgrade” on the IBM ESS site, so although I had everything, the only keys it originally pulled were for the base OS.  After I chose to “upgrade” on the ESS site, I suddenly had a boatload of keys, which installed just fine.

When you upgrade once every few years there are always things you forget about, so it took a few IPLs to get everything running again.  So now I have a 7.2 and a 7.1 partition on my JS12 blade.  All is right with the world.

Well, ALMOST!  The web sites I had were mostly on their feet, with a couple of exceptions.  My PHP sites were falling over, and examining the logs revealed an interesting error:

exec(): 0509-036 Cannot load program /usr/local/ZendSvr/bin/php-cgi.bin because of the following errors:
rtld: 0712-001 Symbol EVP_md2 was referenced
from module php-cgi.bin(), but a runtime definition
of the symbol was not found.

Hmmm…not sure what was up with that.  I knew my Zend Server install was relatively ancient (5.x), so I decided to contact Mike Pavlak at Zend and see what I should do.   UPGRADE!   Of course that is what he said (duh!), so I downloaded the latest Zend Server version and installed it.  The beautiful new server access pages worked great!  But my WordPress sites, and any other PHP-based pages, were still falling over.  So, back to Mike!  With a little looking over my shoulder, Mike noticed that perhaps I was still back in the dark ages, referencing the wrong server (might it be the old Zend Core?), and he pointed me to a very helpful page that triggered enough neurons to get me thinking along the right lines: the problem might be in the directives.  Turns out it was the FastCGI directives….kind of.  Here is the deal.  I DID have the two things recommended by the page set up correctly:

LoadModule zend_enabler_module /QSYS.LIB/QHTTPSVR.LIB/QZFAST.SRVPGM

and

AddHandler fastcgi-script .php

So, I was a bit stumped….until I read on and saw the recommendation for the contents of the fastcgi.conf file.  My clue was at the very beginning of the file:

CommandLine=”/usr/local/zendsvr6/bin/php-cgi.bin”

Scroll back up and take a look at the path mentioned in the error I encountered:

Cannot load program /usr/local/ZendSvr/bin/php-cgi.bin

AHA! The fastcgi.conf file is where Apache finds out how to execute the PHP page!   So all I had to do was change any reference from /ZendSvr/ to /zendsvr6/ in each of the fastcgi.conf files that I have.  Voila!  All is well!  Everything is happily running under Zend Server 8.0 now.  Thanks Mike!
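If you have several HTTP server instances to fix, a quick sketch of the find-and-replace from a PASE shell (assuming the usual /www/&lt;instance&gt;/conf layout; take backups first!) would look like this:

```shell
# Find fastcgi.conf files still pointing at the old Zend Server path
# and rewrite /ZendSvr/ to /zendsvr6/, keeping a .bak copy of each.
for f in $(grep -rl '/usr/local/ZendSvr/' /www/*/conf/fastcgi.conf 2>/dev/null); do
  cp "$f" "$f.bak"
  sed 's|/usr/local/ZendSvr/|/usr/local/zendsvr6/|g' "$f.bak" > "$f"
done
```

The loop simply does nothing if no file contains the old path, so it is safe to re-run.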

Next I had to update WordPress.  The automatic update has never really worked correctly, so I did a manual update and made the mistake of using the readme.html in the WordPress bundle to guide me.  Don’t use that for upgrading!  Use the instructions found here. Once I took the proper approach, WordPress was up and running in no time.

So, now it is time to play, play, play with all that IBM i has new in 7.2!


Read the original at Pete's Wordshop.


DB2 for i › Using the Blueprint

January 28th, 2015
In the previous post, Dan Cruikshank illuminated the concept of creating a database "blueprint" through the process of data modeling.  Continuing my conversation with Dan, I asked him about using the blueprint, i.e. the data model.  Specifically: can you use SQL to access the data (the answer is YES!), and does it make more sense to embed SQL in a traditional high level language like RPG, or to use separate, modular stored procedures?

Dan's answer and insight follows...

__________


Whether to use SQL stored procedures (or external stored procedures) versus embedded SQL in a monolithic program depends on the environment in which the application exists. If you are developing both host-centric and external (i.e. web or mobile) applications which need to perform common I/O functions, such as consuming result sets or inserting, updating, and deleting rows in a common set of tables, then using stored procedures would be my recommendation. If all development is strictly host-centric (it should not be in this day and age), then embedded SQL would be OK, but not best practice, in my opinion.

From an application methodology perspective, we tend to recommend a Model/View/Controller (MVC) approach for design and development, where the data access and user interface are separated from the high level language code. In the case of data access this would be done via stored procedure calls.

This is not a performance based recommendation; it is more about re-usability, reduced development costs and shorter time to market. Done properly, better performance is a byproduct. Not to mention higher value returned to the business!

In addition, with the added SQL DML support for result set consumption in IBM i 7.1, it is now easier for host-centric applications (i.e. RPG and COBOL) to share a common set of procedures that return result sets. From an SQL DML perspective, prior to 7.1 this was limited to insert, update, and delete procedures, although accessing a result set from a stored procedure was available via CLI calls (not widely used in traditional IBM i shops).

__________
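As a sketch of the kind of procedure Dan describes (the schema, table, and column names here are invented for illustration), a DB2 for i SQL procedure that returns a result set consumable from RPG, COBOL, or a web application might look like this:

```sql
-- Hypothetical example: a reusable data-access procedure that
-- returns one result set to whichever caller invokes it.
CREATE OR REPLACE PROCEDURE shop.get_open_orders (IN p_cust_id INTEGER)
  LANGUAGE SQL
  DYNAMIC RESULT SETS 1
BEGIN
  DECLARE c1 CURSOR WITH RETURN FOR
    SELECT order_id, order_date, order_total
      FROM shop.orders
     WHERE cust_id = p_cust_id
       AND status = 'OPEN';
  OPEN c1;
END;
```

Because the cursor is declared WITH RETURN, the same procedure serves an embedded-SQL RPG program (via ASSOCIATE RESULT SET LOCATOR on 7.1+), a PHP page, or a mobile back end, which is exactly the reuse argument made above.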


If you need additional knowledge, skills or guidance with what Dan is sharing, please do not hesitate to reach out.


Read the original at DB2 for i.


Alan Seiden Consulting: PHP and IBM i Expertise › Free webinars sponsored by COMMON Europe (open to all)

January 23rd, 2015

Starting January 27, 2015, I’ll be giving three free Tuesday webinars:

  • January 27: Strategic Modernization with PHP
  • February 17: Bring RPG/COBOL business logic to the web with the PHP Toolkit
  • March 10: Speedy PHP on IBM i

All three one-hour webinars will be held at 14:00 Central European Time (CET). That’s 8 AM Eastern Standard Time (EST). The registration page includes a time zone converter.

Details and to register: http://www.data3.se/?p=5287

Thanks to Torbjörn Appehl of Data3 (COMMON Sweden) for organizing these.


Read the original at Alan Seiden Consulting: PHP and IBM i Expertise.
