Infosphere Information Server (IIS) – Where Can You View DataStage and QualityStage Logs?

During the course of the week, a discussion came up about the different places where a person can read the DataStage and QualityStage logs in InfoSphere. I had not really thought about it before, but here are a few places that come to mind:

  • IBM InfoSphere DataStage and QualityStage Operations Console
  • IBM InfoSphere DataStage and QualityStage Director client
  • IBM InfoSphere DataStage and QualityStage Designer client by pressing Ctrl+L


Related Reference

IBM Knowledge Center > InfoSphere Information Server 11.7.0 > InfoSphere DataStage and QualityStage > Monitoring jobs

IBM Knowledge Center > InfoSphere Information Server 11.7.0 > Installing > Troubleshooting software installation > Log files

Essbase Connector Error – Client Commands are Currently Not Being Accepted

DataStage Essbase Connector

While investigating a recent InfoSphere Information Server (IIS) DataStage Essbase Connector error, I found the documented explanations of the probable causes not terribly meaningful.  So, now that I have run our error to ground, I thought it might be nice to jot down a quick note on the potential causes of the 'Client Commands are Currently Not Being Accepted' error, which I gleaned from the process.

Error Message Id


Error Message

An error occurred while processing the request on the server. The error information is 1051544 (message on contacting or from application:[<<DateTimeStamp>>]Local////3544/Error(1013204) Client Commands are Currently Not Being Accepted.

Possible Causes of The Error

This error indicates a problem with access to the Essbase object or to the security within the Essbase object.  It can be the result of multiple issues, such as:

  • Object doesn't exist – the Essbase object doesn't exist at the location specified
  • Communications – the location is unavailable or cannot be reached
  • Path security – security prevents access to the Essbase object location
  • Essbase security – security within the Essbase object does not support the user or filter being submitted; the Essbase object security may also be corrupted or incomplete
  • Essbase object structure – the Essbase object is not structured to support the filter, or the Essbase filter is malformed for the current structure

Related References

IBM Knowledge Center, InfoSphere Information Server 11.7.0, Connecting to data sources, Enterprise applications, IBM InfoSphere Information Server Pack for Hyperion Essbase



DataStage – How to Pass the Invocation ID from one Sequence to another

DataStage Invocation ID Passing Pattern Overview

When you are controlling a chain of sequences in a job stream and taking advantage of reusable (multiple-instance) jobs, it is useful to be able to pass the invocation ID from the master controlling sequence down the chain so it is assigned to each job run.  This can easily be done without needing to manually enter the values in each of the sequences, by leveraging the DSJobInvocationId variable.  For this to work:

  • The job must have 'Allow Multiple Instance' enabled
  • The parent sequence must have the Invocation Name entered, which supplies the invocation ID
  • The receiving child sequence must have the invocation variable entered
  • At runtime, an invocation-ID instance of the multi-instance job is generated with its own logs

Variable Name

  • DSJobInvocationId


This approach allows for the reuse of jobs and the assignment of meaningful instance extension names, which are managed from a single point of entry in the object tree.
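As a sketch, the same instance addressing can also be exercised from the command line: the dsjob utility targets a specific instance of a multiple-instance job as job name plus invocation ID. The project, job, and ID names below are placeholders, and the command is only echoed here rather than executed:

```shell
# Placeholder names for illustration only
PROJECT="DEV_PROJECT"
JOB_NAME="Seq_Load_Customers"    # must have 'Allow Multiple Instance' enabled
INVOCATION_ID="DAILY_RUN"

# dsjob addresses a specific instance as <job>.<invocation id>
CMD="dsjob -run -mode NORMAL ${PROJECT} ${JOB_NAME}.${INVOCATION_ID}"
echo "Would run: ${CMD}"
```

Each invocation ID gets its own log in Director, exactly as when the ID is passed down from a controlling sequence via DSJobInvocationId.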

Related References: 

IBM Knowledge Center > InfoSphere Information Server 11.5.0

InfoSphere DataStage and QualityStage > Designing DataStage and QualityStage jobs > Building sequence jobs > Sequence job activities > Job Activity properties

Datastage – When checking operator: Operator of type “APT_TSortOperator”: will partition despite the preserve-partitioning flag on the data set on input port 0

APT_TSortOperator Warning

The APT_TSortOperator warning happens when there is a conflict in the partitioning behavior between stages, usually because the successor (downstream) stage has its 'Partitioning/Collecting' and 'Sorting' properties set in a way that conflicts with the predecessor (upstream) stage's properties, which are set to preserve partitioning.  This can occur when the predecessor stage has the 'Preserve Partitioning' property set to:

  • ‘Default (Propagate)’
  • ‘Propagate’, or
  • ‘Set’

Preserve Partitioning Property – list

Message ID

  • IIS-DSEE-TFOR-00074

Message Text

  • <<Link Name Where Warning Occurred>>: When checking operator: Operator of type “APT_TSortOperator”: will partition despite the preserve-partitioning flag on the data set on input port 0.

Warning Fixes

  • First, verify that the partitioning behaviors of both stages are correct
  • If they are, set the predecessor stage's 'Preserve Partitioning' property to 'Clear'
  • If not, correct the partitioning behavior of the stage that is in error

Clear Partitioning Property Screenshot

Preserve Partitioning Property – Set To Clear

Infosphere DataStage – Boolean Handling for Netezza

DataStage Director Message – Numeric string expected


Beware when you see this message while working with Booleans in DataStage: the message displays as informational (at least it did for me), not as a warning or an error.  Even though it seems innocuous, what it meant for my job was that the Boolean ('true'/'false') input was not being interpreted, and everything was posted as 'false'.

In DataStage, the Netezza 'Boolean' field/data type maps to the 'Bit' SQL type, which expects a numeric input of zero (0) or one (1).  So, my solution (once I detected the problem during unit testing) was to put Transformer stage logic in place to convert the Boolean input to the expected numeric value.


Netezza to DataStage Data Type Mapping

Netezza data type InfoSphere DataStage data type (SQL type) Expected input value

BOOLEAN Bit 0 or 1 (1 = true, 0 = false)


Transformer Stage Boolean Handling Logic

A Netezza Boolean field can store true values, false values, and nulls. So, some thought should be given to your desired data outcome for nulls.

This first example sets nulls to a specific value, which can support a specific business rule for null handling and also provides null handling for non-nullable fields.  Here we set nulls to the numeric value for 'true' and all other non-true inputs to 'false'.

If IsNull(Lnk_Src_In.USER_ACTIVE) Then 1 Else If Lnk_Src_In.USER_ACTIVE = 'true' Then 1 Else 0

This second pair of examples lets the Else clause set the value for nulls; this works if the Else value matches your intended logic direction, and it still provides null handling for non-nullable fields.

  • If Lnk_Src_In.USER_ACTIVE = 'true' Then 1 Else 0

  • If Lnk_Src_In.USER_ACTIVE = 'false' Then 0 Else 1

Director Log Message

Message ID

  • IIS-DSEE-TBLD-00008

Message Text

  • <<Link Name Where Message Occurred>>: Numeric string expected. Use default value.

Or something like this:

  • <<Link Name Where Message Occurred>>: Numeric string expected for input column '<<Field Name Here>>'. Use default value.

Related References


PureData System for Analytics, PureData System for Analytics 7.2.1, IBM Netezza user-defined functions, UDX data types reference information, Supported data types, Boolean

Data types and aliases

PureData System for Analytics, PureData System for Analytics 7.2.1, IBM Netezza stored procedures, NZPLSQL statements and grammar, Variables and constants, Data types and aliases

Logical data types

PureData System for Analytics, PureData System for Analytics 7.2.1, IBM Netezza database user documentation, Netezza SQL basics, Data types, Logical data types

Data type conversions from Netezza to DataStage

InfoSphere Information Server, InfoSphere Information Server 11.5.0, Connecting to data sources, Databases, Netezza Performance Server, Netezza connector, Designing jobs by using the Netezza connector, Defining a Netezza connector job, Data type conversions, Data type conversions from Netezza to DataStage

*DataStage*DSR_PROJECT (Action=8); check DataStage is set up correctly in project




Basically, the Action=8 error, which I normally see when opening the DataStage Director client, means that one or more of the RT_LOG files have become corrupted.  Usually, this problem occurs in relation to disk-space issues, although there can be other causes.

Error Message

Error calling subroutine: *DataStage*DSR_PROJECT (Action=8); check DataStage is set up correctly in project

(Subroutine failed to complete successfully (30107))

The Cleanup approach

The cleanup process really consists of three primary steps:

  • Free disk space
  • Restart application processes
  • Fix corrupted logs

Free Disk Space

This can consist of:

  • Cleaning '/tmp' space
  • Removing any large unnecessary files
  • Enlarging the '/tmp' space allocation
  • Adding additional disk space, if necessary
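
The free-space check can be scripted before any cleanup; here is a minimal sketch (the 100 MB size threshold is illustrative, not prescriptive):

```shell
# Report the space available in /tmp (in 1K blocks) before restarting anything
avail_kb=$(df -P /tmp | awk 'NR==2 {print $4}')
echo "Available in /tmp: ${avail_kb} KB"

# List a few large files in /tmp as candidates for removal
# (always review a file before deleting it)
find /tmp -xdev -type f -size +100M 2>/dev/null | head -5
```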

Restart Application Processes

Once you have freed the disk space, restarting the VM/server is recommended. However, if that is not a realistic option, then at least restart the InfoSphere DataStage engine to ensure the newly freed space is registered by the applications and that everything is restarted and running.

Fix Corrupted logs

Perhaps the cleanest way to reset all the logs is to perform a 'Multiple Job Compile'.  Running the jobs will also overwrite the logs, but that is a little more hit and miss if not all the jobs are in job streams/batches that can be run at this time.  The logs can also be overwritten manually by compiling or resetting individual jobs; the trick with a manual reset is that you have to know which jobs to reset, so this could take a while.  Finally, the logs can be manually dropped and recreated, but I recommend that approach only as a last resort.



*DataStage*DSR_SELECT (Action=3); check DataStage is set up correctly in project



Having encountered this DataStage client error in Linux a few times recently, I thought I would document the solution that has worked for me.

Error Message:

Error calling subroutine: *DataStage*DSR_SELECT (Action=3); check DataStage is set up correctly in project

(Subroutine failed to complete successfully (30107))

Probable Cause of Error

  • The node agent has stopped running
  • Insufficient /tmp disk space

Triage Approach

To fix this error in Linux:

  • Ensure disk space is available; you may want to clean any excess non-required files out of the /tmp directory.
  • Start the node agent, if it is not running.

Command to verify Node Agent is running

ps -ef | grep java | grep Agent


Command to Start Node Agent

This example command assumes the shell script is in its normal location; if not, you will need to adjust the path.

/opt/IBM/InformationServer/ASBNode/bin/ start

Node Agent Logs

These logs may be helpful:

  • asbagent_startup.err
  • asbagent_startup.out

Node Agent Logs Location

This command will get you to where the logs are normally located:

cd /opt/IBM/InformationServer/ASBNode/
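
Putting the check and the start command together, here is a hedged sketch; the install path and the `` script name reflect a common default layout and may differ on your system:

```shell
AGENT_HOME="/opt/IBM/InformationServer/ASBNode"   # assumed default install path

# The bracketed pattern keeps grep from matching its own process entry
if ps -ef | grep java | grep -q '[A]gent'; then
  echo "Node agent appears to be running"
else
  echo "Node agent not found; start it with: ${AGENT_HOME}/bin/ start"
fi
```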

Infosphere Datastage – Client Tier Image Requirements

InfoSphere Information Server (IIS) DataStage Client Tools

A frequent question encountered in the early stages of a client engagement is: what are the recommended Windows client tier image requirements (virtual machine image or desktop)?  However, I have never found this information to be satisfactorily documented by IBM.  So, invariably, we end up providing our best guidance based on experience, which for the normal InfoSphere Information Server (IIS) installation is provided in the table below.

Recommended Developer Client Tier Characteristics

Item Quantity Notes

Application Directory Storage 8 GB or larger

Memory 6 GB or more This should be a developer image, so more is better, especially given the other applications the developer may also be using simultaneously.

CPU Cores 2 or more This should be a developer image, so more is better, especially given the other applications the developer may also be using simultaneously.

Administrative Rights N/A Users performing the client software installation must have full administrative rights on the virtual machine image or individual workstation.

Protection Software N/A All firewalls and virus protection software need to be disabled on the client while the client software installation is in progress.  Also, the required InfoSphere firewall ports for the client tools must be open to use the tools.



Are the Infosphere Windows Client 32 bit or 64 bit?

InfoSphere Information Server (IIS) DataStage Client Tools

The question of whether the IBM InfoSphere Windows client tools are 32-bit or 64-bit comes up rather frequently. The short answer is that the InfoSphere Windows client tools are 32-bit applications that will run on supported 64-bit Windows systems.  This is what IBM calls 64-tolerate (32-bit applications that can run in a 64-bit operating system environment); however, the client tools do not run in native 64-bit mode and do not exploit the 64-bit operating environment to improve performance.



DataStage – IIS-DSEE-TBLD-00008- Processing Input Record APT_Decimal Error

IIS-DSEE-TBLD-00008 APT_Decimal error before disabling combination

This is another one of those nebulous error messages, which can cost a lot of research time if you don't know how to simplify the process a bit.  Determining where the error occurs can be a challenge if you have not encountered this error before and figured out the trick, which isn't exactly intuitive.

In this case, as it turned out, once I had determined where the error was, it was as simple as having missed resetting the stage-variable properties when the precision of the other decimal fields was increased.

How to identify where this error occurs?

Disable operator combination by:

  • adding the APT_DISABLE_COMBINATION environment variable to the job properties,

  • setting the APT_DISABLE_COMBINATION environment variable to True in the job properties, and

  • compiling and running the job again.


This approach will, usually, provide a more meaningful identification of the stage with the error.

Note: Please remember to remove the APT_DISABLE_COMBINATION environment variable before moving the job to testing and/or releasing your code to production.
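
If the variable has also been exposed as an environment-variable job parameter on the job, the toggle can be flipped for a single debugging run from the command line instead of editing job properties each time. This is a sketch with placeholder project and job names, and it assumes the parameter is defined on the job (it is not by default); the command is echoed rather than executed:

```shell
PROJECT="MY_PROJECT"   # placeholder
JOB="My_Job"           # placeholder

# Pass the environment variable as a job parameter for this run only
CMD="dsjob -run -param \$APT_DISABLE_COMBINATION=True ${PROJECT} ${JOB}"
echo "Would run: ${CMD}"
```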

Message ID

  • IIS-DSEE-TBLD-00008

Error Message with combine enabled:

APT_CombinedOperatorController(1),0: Exception caught in processingInputRecord() for input “0”: APT_Decimal::ErrorBase: From: the source decimal has even precision, but non-zero in the leading nybble, or is too large for the destination decimal… Record dropped. Create a reject link to save the record if needed.

Error message with combine disabled

Tfm_Measures_calc,0: Exception caught in processingInputRecord() for input “0”: APT_Decimal::ErrorBase: From: the source decimal has even precision, but non-zero in the leading nybble or is too large for the destination decimal… Record dropped. Create a reject link to save the record if needed.

IIS-DSEE-TBLD-00008 APT_Decimal error after disabling combination

Note: Tfm_Measures_calc is the name of the stage where the error occurred




Infosphere – decimal_from_string Conversion Error

decimal_from_string Conversion Error

This is another one of those nebulous errors, which can be thrown by DataStage, QualityStage, and/or Data Click.  This error can be particularly annoying because it doesn't identify the field, or even the precise command, causing the error.  So, there can be more than one field and/or more than one command causing the problem.


Conversion error calling conversion routine decimal_from_string data may have been lost


To resolve this error, check for correct formatting (date format, decimal, and null-value handling) before passing values to the DataStage StringToDate, DateToString, DecimalToString, or StringToDecimal functions.  Additionally, even if the formatting is correct, you may need to embed conversion commands to completely clear the issue.


Here is a recent example of command embedding, which cleared the issue, but I'm sure you will need to apply this concept in other ways to meet all your needs.

DecimalToString( DecimalToDecimal( <>, 'trunc_zero'), "suppress_zero")

Infosphere Director Scheduler



First, let us start by dispelling a common myth: the InfoSphere scheduler is not an enterprise scheduler application.

The scheduling service in InfoSphere leverages the operating system (OS) scheduler (in the case of Linux, cron) and provides a graphical user interface with time-based capability to schedule jobs at the suite-component level.  The Director client scheduler can:

  • Schedule individual jobs and sequence jobs
  • Schedule jobs/sequencers to run:
    • Today
    • Tomorrow
    • Every
    • Next
    • Daily

How to Set Up the Daily Schedule


Steps to set the Daily Schedule are below:

  1. Open the DataStage Director
  2. Once Director is open, click Job > Add to Schedule
  3. Click "Daily" under "Run Job" and choose the time



How to suppress a Change_Capture_Host warning in Datastage

Change Capture Host Warning (IIS-DSEE-TFXR-00017)

Occasionally, I run into this Change Capture Host defaulting warning, so I thought this information might be useful.

Event Type

  • Warning

Message ID

  • IIS-DSEE-TFXR-00017

Example Message

  • Change_Capture_Host: When checking operator: Defaulting “<<FieldName>>” in transfer from “beforeRec” to “outputRec”.

Setting Variable

  • Set APT_DISABLE_TFXR0017=1
  • This environment variable can be added either at the project level or at the job level.

Alternative Solution

  • Within the Change Capture stage properties:
    • Stage tab
    • Options
    • Property: "Change Mode"
    • Value: "Explicit Key, All Values"

Netezza JDBC Error – Unterminated quoted string

The 'Unterminated quoted string' error occurs from time to time when working with the InfoSphere DataStage Netezza JDBC Connector stage and is nebulous, at best.  However, the solution is normally straightforward once you understand it.  Usually, this error is the result of a target table field (or fields) being shorter than the input data.  The fix is normally to compare your input field lengths (or composite field length, if consolidating fields into one field) and adjust the target field length higher.  In some cases, if business rules allow, you may be able to substring or truncate the input data (not a recommended approach), but information can be lost with this approach.


org.netezza.error.NzSQLException: ERROR:  Unterminated quoted string

Example Error Message


Tgt_IIS_Job_Dim,0: The connector encountered a Java exception:
org.netezza.error.NzSQLException: ERROR: Unterminated quoted string
    at org.netezza.internal.QueryExecutor.getNextResult(
    at org.netezza.internal.QueryExecutor.execute(
    at org.netezza.sql.NzConnection.execute(
    at org.netezza.sql.NzStatement._execute(
    at org.netezza.sql.NzPreparedStatament.executeUpdate(

How to Schedule an Infosphere Datastage Job in crontab

This is a quick, easy, shortcut way to schedule an InfoSphere DataStage job in Linux crontab, to take advantage of capabilities within crontab that are not available in the InfoSphere graphical user interface (GUI).

For this example, the job's schedule has been adjusted from the standard InfoSphere scheduler graphical user interface (GUI) setting so that it runs every 15 minutes, an interval that is not available in the GUI.

The Basic crontab Scheduling Process

  • Schedule the job in DataStage Director
  • Log in to Linux as the user who created the schedule
  • Run the 'crontab -e' command

Linux crontab -e

  • Edit the crontab command line using vi commands

Edited InfoSphere crontab Command – 15-Minute Intervals

  • Save changes


Note: If the revised schedule differs from the InfoSphere scheduler GUI's standard settings, the change will not display in the GUI.  However, the jobs will run as scheduled, if edited correctly, and this can be verified in the Director client.
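
For reference, a 15-minute crontab entry looks something like the sketch below; the dsjob path, project name, and job name are placeholders, and the line cron actually holds for your schedule will differ:

```
# min   hour  dom  mon  dow  command
*/15    *     *    *    *    /opt/IBM/InformationServer/Server/DSEngine/bin/dsjob -run -mode NORMAL MY_PROJECT My_Job >/dev/null 2>&1
```

The */15 step syntax in the minute field is exactly the kind of setting the Director GUI cannot express, which is why the crontab edit is needed.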


How to Export a Job Log from Director to a file

InfoSphere Information Server DataStage Director Log Export


  • How do I save the log of a DataStage Job into a file using DataStage Director?


  • Basically, the log is exported by being printed to a file.

To export and/or print a DataStage Director log

Set filter

  • Select View > Filter Entries
  • Select ‘Start of Last Run’ to print only the most recent run log.
  • Make sure all Types of Messages are selected to export a complete set of log entries.

Print/export the Job Log

  • Make sure you are viewing the log details, then,
  • Print the current view:
    1. Display the Print dialog box by:
      1. Choose Project > Print, or
      2. Click the Print button on the toolbar.
    2. Choose the Range of items to print in the Range area:
      1. Select print ‘All entries’ in the current view.
    3. Choose the Print what area:
      1. Select 'Full details' to print all details for each log entry.
    4. Select the ‘Print to file’ check box.
    5. Click OK.
    6. When the Print to file dialog box appears:
      1. Choose the path to which the file will be saved, and
      2. Enter the name of a text file to use.


Note: The default is DSDirect.txt in the client directory.
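
As an alternative to printing from the GUI, the dsjob command-line utility can usually dump a log summary straight to a file. This is a hedged sketch: the project and job names are placeholders, and it assumes dsjob is on PATH on the engine tier (normally after sourcing dsenv):

```shell
PROJECT="MY_PROJECT"   # placeholder
JOB="My_Job"           # placeholder

if command -v dsjob >/dev/null 2>&1; then
  # -logsum writes a summary of the job's log entries to stdout
  dsjob -logsum "${PROJECT}" "${JOB}" > "${JOB}_log.txt"
  echo "Log summary written to ${JOB}_log.txt"
else
  echo "dsjob not on PATH - run this on the engine tier after sourcing dsenv"
fi
```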





Infosphere Information Server (IIS) Component Alignment

Infosphere Information Server SDLC Alignment

In recent history, I have been asked several times to describe where different IIS components fit in the Software Development Lifecycle (SDLC) process.  The graphic above lists most of the more important IIS components in their relative SDLC relationships. However, it is important to note that these are not absolutes. Many applications may cross boundaries depending on the practices of the individual company, the application suite licensed by the company, and/or the applications implemented by the company.  For example, many components will participate in the sustainment phase of the SDLC, although I did not list them in that role. This is especially true if you are using the governance tools (e.g., Governance Catalog) and supporting your sustainment activities with modeling and development tools, such as Data Architect.

Related References

InfoSphere Information Server (IIS) Component Descriptions

Infosphere Information Server LaunchPad Page


Each IIS component has a primary function in the InfoSphere architecture, which can be synopsized as follows:


Application – Function

  • Blueprint Director – IBM InfoSphere Blueprint Director is aimed at the information architect designing solution architectures for information-intensive projects.
  • Cognos (if purchased) – Governance dashboard (Framework Manager model provided by IBM), semantics, analytics, and reporting.
  • Data Architect – Data Architect is an enterprise data modeling and integration design tool. You can use it to discover, model, visualize, relate, and standardize diverse and distributed data assets, including dimensional models.
  • Data Click – Data Click is an exciting new capability that helps novices and business users retrieve data and provision systems easily in only a few clicks.
  • DataStage – DataStage is a data integration tool that enables users to move and transform data between operational, transactional, and analytical target systems.
  • Discovery – Discovery is used to identify the transformation rules that have been applied to source system data to populate a target. Once accurately defined, these business objects and transformation rules provide the essential input into information-centric projects.
  • FastTrack – FastTrack streamlines collaboration between business analysts, data modelers, and developers by capturing and defining business requirements in a common format and then transforming that business logic (Source-to-Target Mapping (STTM)) directly into DataStage ETL jobs.
  • Glossary Anywhere – Business Glossary Anywhere, its companion module, augments Governance Catalog with more ease-of-use and extensibility features.
  • Governance Catalog – The Governance Catalog includes business glossary assets (categories, terms, information governance policies, and information governance rules) and information assets.
  • Information Analyzer – Information Analyzer provides capabilities to profile and analyze data.
  • Information Services Director – Information Services Director provides a unified and consistent way to publish and manage shared information services in a service-oriented architecture (SOA).
  • Metadata Asset Manager – Import, export, and manage common metadata assets in the metadata repository and across applications.
  • Operations Console – Admin workspaces to investigate data, deploy applications and web services, and monitor schedules and logs.
  • QualityStage – QualityStage provides data cleansing capabilities to help ensure quality and consistency by standardizing, validating, matching, and merging information to create comprehensive and authoritative information.
  • Server Manager – A deployment tool to move, deploy, and control DataStage and QualityStage assets.


What are the Infosphere DataStage job status log values?

IBM InfoSphere Information Server (IIS)

These are the job status codes seen when running InfoSphere DataStage jobs and sequences.  Additional information regarding a specific job or sequence error can be seen in Director.

Table of dsjob Utility Status Codes


Log/Status Description Job State Comments
0 Running Not Runnable This is the only status that means the job is actually running
1 Finished Runnable Job finished a normal run with no warnings
2 Finished (See Log) Runnable Job finished a normal run with warnings
3 Aborted Not Runnable Job finished a normal run with a fatal error
4 Queued Not Runnable Job queued waiting for resource allocation
8 Failed Validation Not Runnable
9 Has Been Reset Runnable
11 Validated OK Runnable
12 Validated (See Log) Runnable
13 Failed Validation Not Runnable
21 Has Been Reset Runnable
96 Aborted Not Runnable
97 Stopped Not Runnable
98 Not Compiled Not Runnable
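
In a wrapper script, the table above can drive branching on the status dsjob reports. Here is a minimal sketch; the grouping into runnable and not-runnable states follows the table, and the example status value is arbitrary:

```shell
status=2   # example value: "Finished (See Log)"

case "${status}" in
  0)                echo "Running" ;;
  1|2|9|11|12|21)   echo "Runnable" ;;
  4)                echo "Queued" ;;
  3|8|13|96|97|98)  echo "Not runnable - check the job log in Director" ;;
  *)                echo "Unknown status: ${status}" ;;
esac
```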