Infosphere Information Server (IIS) – Where Can You View DataStage and QualityStage Logs?

During the course of the week, a discussion came up regarding the different places where a person might read the DataStage and QualityStage logs in InfoSphere. I hadn’t really thought about it before, but here are a few places that come to mind:

  • IBM InfoSphere DataStage and QualityStage Operations Console
  • IBM InfoSphere DataStage and QualityStage Director client
  • IBM InfoSphere DataStage and QualityStage Designer client by pressing Ctrl+L


Related Reference

IBM Knowledge Center > InfoSphere Information Server 11.7.0 > InfoSphere DataStage and QualityStage > Monitoring jobs

IBM Knowledge Center > InfoSphere Information Server 11.7.0 > Installing > Troubleshooting software installation > Log files

Essbase Connector Error – Client Commands are Currently Not Being Accepted

DataStage Essbase Connector

While investigating a recent InfoSphere Information Server (IIS) DataStage Essbase Connector error, I found the explanations of the probable causes of the error not terribly meaningful. So, now that I have run the error to ground, I thought it might be nice to jot down a quick note on the potential causes of the ‘Client Commands are Currently Not Being Accepted’ error, which I gleaned from the process.

Error Message Id

  • IIS-CONN-ESSBASE-01010

Error Message

An error occurred while processing the request on the server. The error information is 1051544 (message on contacting or from application:[<<DateTimeStamp>>]Local////3544/Error(1013204) Client Commands are Currently Not Being Accepted.

Possible Causes of The Error

This error is a problem with access to the Essbase object or with the security within the Essbase object. It can be a result of multiple issues, such as:

  • Object doesn’t exist – the Essbase object didn’t exist in the location specified.
  • Communications – the location is unavailable or cannot be reached.
  • Path security – security prevents access to the Essbase object location.
  • Essbase security – security within the Essbase object does not support the user or filter being submitted; the Essbase object security may also be corrupted or incomplete.
  • Essbase object structure – the Essbase object is not structured to support the filter, or the filter is malformed for the current structure.

Related References

IBM Knowledge Center, InfoSphere Information Server 11.7.0, Connecting to data sources, Enterprise applications, IBM InfoSphere Information Server Pack for Hyperion Essbase


 

What Are The DataStage / QualityStage Join Stages?

Three Stages Which Join Records

While chasing an error which applied only to join-type stages, I thought it might be nice to identify what the InfoSphere Information Server DataStage / QualityStage join stages are. There are three of them, as you can see from the picture above:

  • Join Stage,
  • Lookup Stage,
  • And, Merge Stage.

All three stages join data based on the values of identified key columns.

Related References

IBM Knowledge Center, InfoSphere Information Server 11.7.0, InfoSphere DataStage and QualityStage, Developing parallel jobs, Processing Data, Lookup Stage

IBM Knowledge Center, InfoSphere Information Server 11.7.0, InfoSphere DataStage and QualityStage, Developing parallel jobs, Processing Data, Join Stage

IBM Knowledge Center, InfoSphere Information Server 11.7.0, InfoSphere DataStage and QualityStage, Developing parallel jobs, Processing Data, Merge Stage


DataStage – How to Pass the Invocation ID from one Sequence to another

DataStage Invocation ID Passing Pattern Overview

When you are controlling a chain of sequences in the job stream and taking advantage of reusable (multiple-instance) jobs, it is useful to be able to pass the Invocation ID from the master controlling sequence and have it passed down and assigned to the job run. This can easily be done without needing to manually enter the values in each of the sequences, by leveraging the DSJobInvocationId variable. For this to work:

  • The job must have ‘Allow Multiple Instance’ enabled
  • The Invocation Id must be provided in the parent sequence; its Job Activity must have the Invocation Name entered
  • The receiving child sequence’s Job Activity will have the invocation variable entered
  • At runtime, a DataStage invocation-id instance of the multi-instance job will be generated with its own logs

Variable Name

  • DSJobInvocationId

Note

This approach allows for the reuse of jobs and the assignment of meaningful instance extension names, which are managed from a single point of entry in the object tree.

Related References

IBM Knowledge Center > InfoSphere Information Server 11.5.0

InfoSphere DataStage and QualityStage > Designing DataStage and QualityStage jobs > Building sequence jobs > Sequence job activities > Job Activity properties

Common Information Technology Architectures

Overview Of Common Information Technology Architectures

The world is currently in the information technology era, even as many experts are of the opinion that the Silicon Valley days are beginning to come to an end. Information technology is basically what the world revolves around today, which makes it worthwhile to consider a technical overview of how information technology architectures are used. The term information technology is often used in place of computer networks, but it also covers other information-related technologies such as television and cell phones, showing the connection between IT and ICT (though IT and ICT are often used interchangeably, they are technically different).

An IT architecture is the framework or basis that supports an organization or system. In computing, it involves the virtual and physical resources that support the collection, processing, analysis, and storage of data. The architecture can be integrated into a single data center or, in other instances, decentralized into multiple data centers, which can be managed and controlled by the IT department or by a third-party IT firm, such as a cloud provider or colocation facility. IT architectures usually come into play when we consider computer hardware (Big Iron: mainframes and supercomputers), software, the internet (LAN / WAN server-based networks), e-commerce, telecom equipment, storage (cloud), and so on.

Information Technology Industry Overview

Human beings have been able to manipulate, store, and retrieve data since 3000 BC, but the modern sense of information technology first appeared in a 1958 article published in the Harvard Business Review. Harold J. Leavitt and Thomas L. Whisler were the authors, and they commented that the new technology did not yet have an established name; they proposed calling it information technology (IT). Information technology is used in virtually all sectors and industries: education, agriculture, marketing, health, governance, finance, and so on. Whatever you do, it is always useful to have a basic overview of the architectural uses of information technology. Now we take a look at some standard information technology architecture uses with regard to technology environment patterns such as Big Iron (mainframes and supercomputers), cloud, LAN / WAN server-based networks, and storage (cloud).

Big Iron (Mainframe & Supercomputers)

Big iron is a term used by hackers; as defined in the hacker’s dictionary, the Jargon File, it refers to “large, expensive, ultra-fast computers. It is used for number-crunching supercomputers such as Crays, but can include more conventional big commercial mainframes”. The term is often used in connection with IBM mainframes when discussing their survival after the invention of lower-cost Unix computing systems. More recently, the term has also been applied to highly efficient computer servers and server ranches, whose steel racks work in much the same manner.

Supercomputers are known to be the world’s fastest and largest computers, and they are primarily used for complex scientific calculations. A supercomputer and a desktop computer contain similar components: both have memory, processors, and hard drives. Although similarities exist between supercomputers and desktop computers, the speeds are significantly different; supercomputers are far faster and larger. A supercomputer’s large disk storage, high memory, and many processors increase the speed and power of the machine. While desktop computers can perform millions of floating-point operations per second (megaflops), supercomputers perform billions of operations per second (gigaflops) and even up to trillions of operations per second (teraflops).

Mainframe Computers

Evolution Of Mainframe and Supercomputers

Currently, many computers are indeed faster than the very first supercomputer, the Cray-1, which was designed and developed by the Cray Research team during the mid-70s. The Cray-1 could compute at the rate of 167 megaflops using a rapid form of computing called vector processing, which consists of the quick execution of instructions in a pipelined fashion. In the mid-80s a faster method of supercomputing originated, called parallel processing. Applications that make use of parallel processing can solve computational problems by using multiple processors. For example: suppose you were going to prepare ice cream sundaes for nine of your friends. You would need ten scoops of ice cream, ten bowls, ten drizzles of chocolate syrup, and ten cherries. Working alone, you would put one scoop of ice cream in each bowl, then drizzle the syrup on each one, and so on; this method of preparing sundaes is analogous to vector processing. To get the job done more quickly, you would need help from some friends, working in a parallel-processing fashion: if five people prepare the sundaes, the work goes roughly five times as fast.

Parallel Processing

Application Of Mainframe and Supercomputers

Supercomputers are so powerful that they can provide researchers with insight into phenomena that are too small, too fast, too big, or too slow to observe in laboratories. Astrophysicists use supercomputers as time machines to explore the past and the future of the universe. A fascinating supercomputer simulation created in the year 2000 depicted the collision of two galaxies, Andromeda and our own Milky Way, although this collision will not happen for roughly another 3 billion years.

This particular simulation allowed scientists to experiment and view the result now. The simulation was conducted by Blue Horizon, a parallel supercomputer at the San Diego Supercomputer Center. Using 256 of Blue Horizon’s 1,152 processors, the simulation showed what would happen to millions of stars if the galaxies collided. Another example is molecular dynamics (the interactions of molecules with each other). Simulations run on supercomputers allow scientists to study the interactions that occur when two molecules are docked. Researchers can generate an atom-by-atom picture of the molecular geometry by determining the shape of a molecule’s surface. Atomic experimentation at this level is extremely difficult or impossible to perform in a laboratory environment, but supercomputers have paved the way for scientists to simulate such behaviors with ease.

Supercomputers Of The Future

Various research centers are always diving into new applications, such as data mining, to explore additional uses of supercomputing. Data mining allows scientists to find previously unknown relationships among data; for example, the Protein Data Bank at the San Diego Supercomputer Center collects scientific data that provides scientists all around the world with better ways of understanding biological systems. This gives researchers new insights into the effects, causes, and treatments of many diseases. The capabilities and applications of supercomputers will continue to grow as institutions all over the world share their discoveries, making researchers more proficient at parallel processing.

Information Technology Data Storage

Electronic data storage, which is widely used in modern computers today, dates from World War II, when delay-line memory was developed to remove interference from radar signals. The Williams tube was the first random-access digital storage device; it was based on the cathode-ray tube, but consumed considerable electrical power. Both were volatile: the information stored in delay-line memory and in the Williams tube had to be continuously refreshed, and it was lost whenever power was removed. The first form of non-volatile computer storage was the magnetic drum, invented in 1932 and used in the Ferranti Mark 1, the first commercially available general-purpose electronic computer.

IBM introduced the first hard disk drive in 1956, as a component of their 305 RAMAC computer system. Most digital data today is stored magnetically on hard disks, or optically on media such as CD-ROMs. In 2002, digital storage capacity exceeded analog capacity for the first time. By 2007, almost 94% of the data stored in the world was held digitally: 52% on hard disks, 28% on optical devices, and 11% on digital magnetic tape. The worldwide capacity for storing information on electronic devices grew from 3 exabytes in 1986 to 295 exabytes in 2007, doubling roughly every three years.

Cloud Computing

Cloud Storage

Cloud storage is a modern data storage approach in which digital data is stored in logical pools, while the physical storage spans multiple servers, and often multiple locations, in an environment usually owned and managed by a hosting company. Cloud storage providers are responsible for keeping the data available and accessible; individuals and organizations lease or buy storage capacity from the providers to store user, application, or organizational data. Cloud storage originally referred to a hosted object-storage service, but over time the term has broadened to include other types of data storage that are available as a service, such as block storage. Examples of object storage services are Amazon S3 and Microsoft Azure Storage; OceanStore and VISION Cloud are storage systems that can be hosted and deployed with cloud storage characteristics.

Cloud computing is changing the implementation and design of IT infrastructures. Traditional business-owned data centers are mostly private, capital-intensive resources (Big Iron: mainframes and supercomputers); cloud-based computing, on the other hand, enables organizations to access a cloud service provider’s data center infrastructure for a fee that is mostly avoidable cost. In the infrastructure-as-a-service model, cloud computing allows flexible data storage on demand. Consumers can ask a cloud service provider to store, compute, and deliver other IT-related services without installing equipment and other resources locally, saving a lot of space and money, and users can quickly adjust their cloud usage depending on the required workload.

Networks

On a typical day, people tend to use several different IT networks. Checking your email over a Wi-Fi connection on your PC at home uses one typical network.

Logging on to your computer at your place of work, to access files from the company’s database, uses another. And when you are out for coffee, the Wi-Fi hotspot at the coffee shop is yet another type of network-based communications.

All of these networks are set up differently. Networks are mainly categorized according to the geographic area they cover and the requirements of the network within that area. A network can serve just about anyone, from one person with one device to millions of people and devices anywhere on the planet.

Some common network types we will consider include:

  • WAN (Wide Area Network)
  • LAN (Local Area Network)
  • PAN (Personal Area Network)
  • MAN (Metropolitan Area Network)

Let’s go into some detail on these networks.

Area Net Relative Size Relationship

PAN (Personal Area Network)

A PAN (personal area network) is a network serving a single person within a building or nearby. It could be inside a little office or a home. A PAN could incorporate at least one PC, plus phones, small gadgets, video game consoles, and other devices. If several people use the same network inside a home, the network is sometimes known as a HAN (home area network).

In a very common setup, a home will have a single wired Internet connection attached to a modem. This modem then provides both wired and wireless service for numerous devices. The network is typically managed from a PC but can be accessed from other electronic devices.

This kind of network gives incredible flexibility. For instance, it enables you to:

  • Send a report to the printer in the office upstairs while you’re sitting in another room with your laptop
  • Upload the pictures from your mobile phone to cloud storage associated with your desktop PC
  • View movies from an internet streaming platform on your TV

If this sounds familiar, you likely have a PAN in your home without knowing what it’s called.

LAN (Local Area Network)

A LAN (local area network) consists of a computer network at a single site, typically an individual office building. A LAN is useful for sharing resources, such as data storage and printers. LANs can be built with relatively inexpensive hardware, such as network adapters, hubs, and Ethernet cables.

A small LAN may use just two PCs, while bigger LANs can accommodate a much higher number. A LAN depends on wired networking for speed and security optimization; however, wireless devices can be connected to a LAN. High speed and relatively low cost are the defining attributes of LANs.

LANs are typically used where individuals need to share resources and information among themselves but not with the outside world. Think about an office building where everyone ought to be able to get to records on the server or print a document to at least one shared printer. Those tasks ought to be simple for everyone working in the same office, yet you would not want a stranger strolling into the office and having access.

 

MAN (Metropolitan Area Network)

A MAN (metropolitan area network) consists of a computer network across a whole city, a school campus, or a small district. Depending on the configuration, this kind of network can cover an area from 5 to around 50 kilometers across. A MAN is often used to connect a group of LANs together to form a broader network. When this kind of network is designed mainly for a campus, it can be called a CAN (campus area network).

WAN (Wide Area Network)

A WAN (wide area network) covers a vast region, for example, a whole nation or the entire world. A WAN can contain various smaller networks, such as LANs or MANs. The Internet is the best-known example of a public WAN.

Conclusion

The world is changing rapidly as modern growth continues unabated. With so much change happening, it is good that education can reach students in various ways. The students of today are the leaders, teachers, inventors, and business people of tomorrow. Information technology plays a crucial role in students being able to keep their jobs and go to school, especially now that most schools offer online courses and classes that can be accessed on tablets, laptops, and mobile phones.

Information technology is reshaping many aspects of the world’s economies, governments, and societies. IT provides more efficient services, catalyzes economic growth, and strengthens social networks, and about 95% of the world’s population now lives in an area with some implemented use of IT. IT is diversified; whatever you are using to read this article is itself based on IT architectural features. Technological advancement is a positive force behind economic growth, citizen engagement, and job creation.

DataStage – How to use single quoted parameter list in an Oracle Connector

Data Integration

While working with a client’s 9.1 DataStage version, I ran into a situation where they wanted to parameterize SQL where-clause lists in an Oracle Connector stage, which honestly was not very straightforward to figure out. First, if APT_OSL_PARAM_ESC_SQUOTE is not set and single quotes are used in the parameter, the job creates unquoted, invalid SQL when the parameter is populated. Second, I found much of the available information confusing and/or incomplete in its explanation. After some research and some trial and error, here is how I resolved the issue. I’ll endeavor to be concise, but holistic, in my explanation.

When this Variable applies

Here is where I know this process applies; there may be other circumstances to which it is applicable, but I’m listing the ones here with which I have recent experience.

Infosphere Information Server Datastage

  • Versions 9.1, 11.3, and 11.5

Oracle RDBMS

  • Versions 11g and 12c

Configuration process

Here is a brief explanation of the steps I used to implement the where clause as a parameter. Please note that in this example I am using a job parameter to populate only a portion of the where clause; you can certainly pass the entire where clause as a parameter, if it is not too long.

Configure Project Variable in Administrator

  • Add APT_OSL_PARAM_ESC_SQUOTE to project in Administrator
  • Populate the APT_OSL_PARAM_ESC_SQUOTE variable with the backslash escape character (\)

APT_OSL_PARAM_ESC_SQUOTE Project Variable

Create job parameter

Following your project naming convention or standard practice (if your customer and/or project does not have established naming conventions), create the job parameter in the job. See the jp_ItemSource parameter in the image below.

Job Parameter In Oracle Connector

Add job parameter to Custom SQL in Select Oracle Connector Stage

Once the job parameter has been created, add the job parameter to the SQL statement of the job.

Job Parameter In SQL
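For context, here is a minimal sketch of what the custom SQL might look like with the parameter embedded; the table and column names are hypothetical, and jp_ItemSource is assumed to hold a single-quoted, comma-separated list (e.g. 'A','B','C'):

-- Hypothetical select; #jp_ItemSource# resolves to a quoted list such as 'A','B','C'
SELECT ITEM_ID,
       ITEM_NAME,
       ITEM_SOURCE
FROM   ITEM_MASTER
WHERE  ITEM_SOURCE IN (#jp_ItemSource#)

With APT_OSL_PARAM_ESC_SQUOTE set, the single quotes in the parameter value survive substitution; without it, the generated SQL would be unquoted and invalid.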

Related References

IBM Knowledge Center > InfoSphere Information Server 11.5.0

Connecting to data sources > Databases > Oracle databases > Oracle connector

IBM Support > Limitation of the Parameter APT_OSL_PARAM_ESC_SQUOTE on Plugins on Parallel Canvas

IBM Knowledge Center > InfoSphere Information Server 11.5.0

InfoSphere DataStage and QualityStage > Reference > Parallel Job Reference > Environment Variables > Miscellaneous > APT_OSL_PARAM_ESC_SQUOTE

 

OLTP vs Data Warehousing

Database

OLTP Versus Data Warehousing

I’ve tried to explain the difference between OLTP systems and a Data Warehouse to my managers many times, as I’ve worked at a hospital as a Data Warehouse manager / data analyst for many years. Why was the list that came from the operational applications different from the one that came from the Data Warehouse? Why couldn’t I just get a list of the patients lying in the hospital right now from the Data Warehouse? So I explained, and explained again, and explained it to another manager, and another. You get the picture.
In this article I will explain this very same thing to you, so you know how to explain it to your manager. Or, if you are a manager, so you understand what your data analyst can and cannot give you.

OLTP

OLTP stands for OnLine Transactional Processing. In other words: getting your data directly from the operational systems to make reports. An operational system is a system that is used for the day-to-day processes.
For example: when a patient checks in, his or her information is entered into a patient information system. The doctor puts scheduled tests, a diagnosis, and a treatment plan in there as well. Doctors, nurses, and other people working with patients use this system on a daily basis to enter and retrieve detailed information on their patients.
The data within operational systems is stored so that it can be used efficiently by the people working directly on the product, or with the patient in this case.

Data Warehousing

A Data Warehouse is a big database that fills itself with data from operational systems. It is used solely for reporting and analytical purposes; no one uses this data for day-to-day operations. The beauty of a Data Warehouse is, among other things, that you can combine the data from different operational systems. You can, for example, combine the number of patients in a department with the number of nurses. You can see how far a doctor is behind schedule and find the cause by looking at the patients. Does he run late with elderly patients? Is there a particular diagnosis that takes more time? Or does he just oversleep a lot? You can use this information to look at the past and see trends, so you can plan for the future.

The difference between OLTP and Data Warehousing

This is how a Data Warehouse works:

How a Data Warehouse works

The data is entered into the operational systems. The ETL processes then Extract this data from those systems, Transform the data so it fits neatly into the Data Warehouse, and then Load it into the Data Warehouse. After that, reports are built with a reporting tool from the data in the Data Warehouse.
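As a minimal sketch of the Load step, assuming hypothetical STG_PATIENT (staging) and DW_PATIENT_DIM (warehouse) tables, the end of an ETL run might issue something like:

-- Load: move transformed staging rows into the warehouse dimension
INSERT INTO DW_PATIENT_DIM (PATIENT_SRKY, PATIENT_ID, PATIENT_NAME, LOAD_DATE)
SELECT PATIENT_SRKY,
       PATIENT_ID,
       PATIENT_NAME,
       CURRENT_DATE
FROM   STG_PATIENT;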

This is how OLTP works:

How OLTP works

Reports are made directly from the data inside the database of the operational systems. Some operational systems come with their own reporting tool, but you can always use a standalone reporting tool to make reports from the operational databases.

Pros and Cons

Data Warehousing

Pros:

  • There is no strain on the operational systems during business hours
    • Because you can schedule the ETL processes to run during the hours when the fewest people are using the operational system, you won’t disturb the operational processes. And when you need to run a large query, the operational systems won’t be affected, as you are working directly on the Data Warehouse database.
  • Data from different systems can be combined
    • It is possible to combine finance and productivity data, for example, because the ETL process transforms the data so that it can be combined.
  • Data is optimized for making queries and reports (see the example query after this list)
    • You use different data in reports than you use on a day-to-day basis, and a Data Warehouse is built for this. For instance, most Data Warehouses have a separate date table where the weekday, day, month, and year are saved. You can make a query to derive the weekday from a date, but that takes processing time; by using a separate table like this, you save time and decrease the strain on the database.
  • Data is saved longer than in the source systems
    • The source systems need to have their old records deleted when they are no longer used in day-to-day operations, to maintain performance; the Data Warehouse keeps that history.
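To illustrate the date-table point above, a minimal sketch assuming hypothetical SALES_FACT and DATE_DIM tables (the TO_CHAR call is Oracle syntax):

-- Without a date table: derive the weekday on every query
SELECT ORDER_DATE, TO_CHAR(ORDER_DATE, 'Day') AS WEEKDAY
FROM   SALES_FACT;

-- With a date table: read the pre-computed weekday from the date dimension
SELECT F.ORDER_DATE, D.WEEKDAY_NAME
FROM   SALES_FACT F
JOIN   DATE_DIM D ON D.CALENDAR_DATE = F.ORDER_DATE;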

Cons:

  • You are always looking at the past
    • A Data Warehouse is updated once a night, or even just once a week, which means you never have the very latest data. Staying with the hospital example: you never know how many patients are in the hospital right now, or which surgeon didn’t show up on time this morning.
  • You don’t have all the data
    • A Data Warehouse is built for discovering trends and showing the big picture. The little details, the ones not used in trends, are discarded during the ETL process.
  • Data isn’t the same as the data in the source systems
    • Because the data is older than that of the source systems, it will always be a little different, and the Transformation step in the ETL process introduces differences as well. That doesn’t mean one or the other is wrong; it’s just a different way of looking at the data. For example: the Data Warehouse at the hospital excluded all transactions that were marked as cancelled. If you try to get the same report from both systems and don’t exclude the cancelled transactions in the source system, you’ll get different results.

Online Transactional Processing (OLTP)

Pros

  • You get real time data
    • If someone is entering a new record now, you’ll see it right away in your report. No delays.
  • You’ve got all the details
    • You have access to all the details that the employees have entered into the system. No grouping, no skipping records, just all the raw data that’s available.

Cons

  • You are putting strain on an application during business hours
    • When you run a large query, you can consume processing capacity that would otherwise be available to the people who need the system for their day-to-day operations. And if you make an error, for instance by forgetting to put a date filter on your query, you could even bring the system down so no one can use it anymore.
  • You can’t compare the data with data from other sources
    • Even when the systems are similar, like an HR system and a payroll system that work with each other, the data is always going to differ, because it is granular at a different level or because not all data is relevant to both systems.
  • You don’t have access to old data
    • To keep the applications at peak performance, old data that is irrelevant to day-to-day operations is deleted.
  • Data is optimized to suit day-to-day operations
    • And not for report making. This means you’ll have to get creative with your queries to get the data you need.

So what method should you use?

That all depends on what you need at the moment. If you need detailed information about things that are happening now, you should use OLTP.
If you are looking for trends, or insights at a higher level, you should use a Data Warehouse.


What is Information Management?

Information Management (IM)

Information Management Definition

The definition of Information Management (IM) tends to vary based on your business perspective, but broadly it is all the systems, processes, and practices (business and technical) within an organization for the creation, use, and disposal of business information to support business operations.

Information Management (IM) Activities

Information Management activities may include, but are not limited to:

  • Information creation, capture, storage, and disposal
  • The governance of information: practices, meaning, and usage
  • Information protection: regulatory compliance, privacy, and limiting legal liability
  • Technological infrastructure, such as architecture, strategies, and delivery enablement


 

SFDC – Using a timestamp literal in a where clause

Salesforce Connector

Working with timestamp literals in the InfoSphere SFDC Connector SOQL is much like working with date literals. So, here is a quick example which may save you some time.

SOQL Timestamp String Literals Where Clause Rules

Basically, the timestamp pattern is straightforward and similar to the process for dates, but there are some differences. The basic rules for a SOQL where clause are:

  • No quotes
  • No functions
  • No casting function, or casting, for the SOQL where clause to read it
  • It only applies to datetime fields
  • A timestamp identifier ‘T’
  • And the ISO 8601 time notation

Example SOQL Timestamp String Literals

So, here are a couple of timestamp string literal examples in SOQL:

  • 1901-01-01T00:00:00-00:00
  • 2016-01-31T00:00:00-00:00
  • 9999-10-31T00:00:00-00:00

Example SOQL with Timestamp String Literal Where Clause

 

Select e.Id,
       e.AccountId,
       e.StartDateTime
From Event e
Where e.StartDateTime > 2014-10-31T00:00:00-00:00

 

Related References

Salesforce Developer Documentation

Home, Developer Documentation, Force.com SOQL and SOSL Reference

https://developer.salesforce.com/docs/atlas.en-us.soql_sosl.meta/soql_sosl/sforce_api_calls_soql_select_dateformats.htm

Salesforce Workbench

Home, Technical Library, Workbench

W3C

Date Time Formats

 

What is an ERP?

Enterprise Resource Planning (ERP)

What does ERP mean?

  • ERP Means “Enterprise Resource Planning”

What is an ERP?

  • An ERP is a business software application, or series of applications, which facilitates the daily operations of a business. An ERP can be a commercial-off-the-shelf (COTS) application (which may or may not be customized), custom built (home grown) by the business, and/or an assemblage of applications and/or modules from a variety of vendors.

Common ERP Major Functions

  • ERP application software typically supports these major business operations:

Financials Management System (FMS)

  • FMS supports accounting, consolidation, planning, and procurement.

Customer Relationship Management (CRM)

  • CRM facilitates customer interactions and data throughout the customer lifecycle, with the goal of improving business relationships with customers, assisting in customer retention and sales growth.

Human Resources Management System (HRMS)

  • HRMS supports workforce acquisition, workforce management, workforce optimization, and benefits administration

Enterprise Learning Management (ELM)

  • ELM is the integrated application which increases workforce knowledge, skills, and competencies to achieve critical organizational objectives.

Asset Management (AM)

  • AM supports activities for deploying, operating, maintaining, upgrading, and disposing of assets cost-effectively.

Supply Chain Management (SCM)

  • SCM is the oversight of materials, information, and finances as they move in a process from supplier to manufacturer to wholesaler to retailer to consumer.


Salesforce – Useful Links


Salesforce

 

Here are a few SFDC links which I have found to be useful.

Salesforce Developers

Home, Developer Documentation, Force.com SOQL and SOSL Reference

IBM Knowledge Center

InfoSphere Information Server, InfoSphere Information Server 11.5.0, Connecting to data sources, Enterprise applications, IBM InfoSphere Information Server Pack for Salesforce.com

Salesforce Developers

Home, Technical Library, Workbench

Salesforce Developers

Home, Technical Library, Force.com Tools and Toolkits


 

 

What does CRM Mean?

Customer Relationship Management (CRM)

 

What is CRM?

CRM (customer relationship management) is a type of ERP application used to facilitate sales, marketing, and business development interactions throughout the customer life cycle.

What does a CRM Application do?

A CRM application’s capabilities, broadly, encompass:

Marketing Integration

  • Lead management, email marketing, and campaign management

Sales Force Automation

  • Contact management, pipeline analysis, sales forecasting, and more

Customer Service & Support

  • Ticketing, knowledge management systems, self-service, and live chat

Field Service Management

  • Scheduling, dispatching, invoicing, and more

Call Center Automation

  • Call routing, monitoring, CTI, and IVR

Help Desk Automation

  • Ticketing, IT asset management, self-service and more

Channel Management

  • Contact and lead management, partner relationship management, and market development funds management

Business analytics integration

  • Analytics application and Business intelligence and reporting integration, which may include internal reporting capabilities.


SFDC – Using a date literal in a where clause

Salesforce Connector

I found working with date literals in the InfoSphere SFDC Connector SOQL counterintuitive, at least compared with how I normally use SQL. I spent a little time running trials in Workbench before I finally locked on to the ‘where clause’ criteria data pattern. So, here is a quick example.

SOQL DATE String Literals Where Clause Rules

Basically, the date pattern is straightforward. The basic rules for a SOQL where clause are:

  • No quotes
  • No functions
  • No casting function, or casting, for the SOQL where clause to read it

Example SOQL DATE String Literals

So, here are a couple of date string literal examples in SOQL:

  • 1901-01-01
  • 2016-01-31
  • 9999-10-31

Example SOQL with Date String Literal Where Clause

 

Select t.Id,
       t.Name,
       t.Target_Date__c,
       t.User_Active__c
From Target_and_Segmentation__c t
Where t.Target_Date__c > 2014-10-31

 

Related References

Salesforce Developer Documentation

Home, Developer Documentation, Force.com SOQL and SOSL Reference

https://developer.salesforce.com/docs/atlas.en-us.soql_sosl.meta/soql_sosl/sforce_api_calls_soql_select_dateformats.htm

Salesforce Workbench

Home, Technical Library, Workbench

 

InfoSphere / DataStage – What are the Supported Connector Stages for dashDB?

dashDB

In a recent discussion, the question came up concerning which InfoSphere DataStage connectors and/or stages are supported by IBM for dashDB. So, it seems appropriate to share the insight gained from having the question answered.

What DataStage Connectors and/or Stages are Supported for dashDB?

You have three choices of connector; which best meets your needs depends on the nature of your environment and the configuration choices that have been applied:

  1. The DB2 Connector Stage
  2. The JDBC Connector stage
  3. The ODBC Stage

Related References

Connecting to IBM dashDB

InfoSphere Information Server, InfoSphere Information Server 11.5.0, Information Server on Cloud offerings, Connecting to other systems, Connecting to IBM dashDB

DB2 connector

InfoSphere Information Server, InfoSphere Information Server 11.5.0, Connecting to data sources, Databases, IBM DB2 databases, DB2 connector

ODBC stage

InfoSphere Information Server, InfoSphere Information Server 11.5.0, Connecting to data sources, Older stages for connectivity, ODBC stage

JDBC data sources

InfoSphere Information Server, InfoSphere Information Server 11.5.0, Connecting to data sources, Multiple data sources, JDBC data sources

Database – What is a Composite Primary Key?

Database Table

What is a Composite Primary Key?

A composite primary key is a primary key that is defined by having multiple fields (columns) in it. As with a simple primary key, exactly what a composite primary key is depends on the database. Essentially, a composite primary key (see the sketch after this list):

  • Is a combination of fields (columns) which uniquely identifies every row
  • Is an index in database systems which use indexes for optimization
  • Is a type of table constraint
  • Is applied with a data definition language (DDL) alter command
  • And may define a parent-child relationship between tables


What is the convert function in Datastage?

Algorithm

 

What is the convert function in Datastage?

In its simplest form, the Convert function in InfoSphere DataStage is a string replacement operation. Convert can be used to replace a specific character, a list of characters, or a Unicode character (e.g. the thumbs-up sign or grinning face emoji).

Convert Syntax

Convert('<<value to be replaced>>', '<<replacement value>>', <<input field>>)

Using the Convert Function to remove a list of Characters

This example handles/converts special characters in a Transformer stage that can cause issues in XML processing and in certain databases.

Convert a list of General Characters

Convert(";:?\+&,*`#'$()|^~@{}[]%!", '', TrimLeadingTrailing(Lnk_In.Description))

Convert Decimal and Double Quotes

Convert(' " . ', '', Lnk_In.Description)

Convert Char(0)

This example replaces Char(0) with an empty string, essentially removing it as padding and/or spacing.

Convert(Char(0), '', Lnk_In.Description)

 

Related References

String functions

InfoSphere Information Server, InfoSphere Information Server 11.5.0, InfoSphere DataStage and QualityStage, Developing parallel jobs, Parallel transform functions, String functions

Oracle Database – Useful links

Oracle Database

Some useful Oracle documentation references:

Database SQL Language Reference (11.2)
https://docs.oracle.com/cd/E11882_01/server.112/e41084/

Database SQL Language Reference (11.2)
https://docs.oracle.com/cd/E11882_01/server.112/e16604/toc.htm

Database SQL Language Reference (12c)
https://docs.oracle.com/database/121/SQLRF/toc.htm

Database (12c) SQL*Plus User’s Guide and Reference
https://docs.oracle.com/database/121/SQPUG/toc.htm

SQL Developer Documentation
http://docs.oracle.com/cd/E12151_01/

Database – What is DDL?

SQL (Structured Query Language)

What is DDL (Data Definition Language)?

DDL (Data Definition Language) statements are used to manage tables, schemas, domains, indexes, views, and privileges. The major actions performed by DDL commands are: create, alter, drop, grant, and revoke.
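A hedged example of each action, using hypothetical object names (exact syntax varies slightly by database):

CREATE TABLE CUSTOMER (CUSTOMER_ID INTEGER NOT NULL, CUSTOMER_NAME VARCHAR(100));  -- create
ALTER TABLE CUSTOMER ADD CUSTOMER_STATUS CHAR(1);                                  -- alter
GRANT SELECT ON CUSTOMER TO REPORTING_USER;                                        -- grant
REVOKE SELECT ON CUSTOMER FROM REPORTING_USER;                                     -- revoke
DROP TABLE CUSTOMER;                                                               -- drop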

 


Data Modeling – Fact Table Effective Practices

Database Table

Here are a few guidelines for modeling and designing fact tables.

Fact Table Effective Practices

  • The table naming convention should identify it as a fact table. For example:
    • Suffix pattern:
      • <<TableName>>_Fact
      • <<TableName>>_F
    • Prefix pattern:
      • FACT_<<TableName>>
      • F_<<TableName>>
  • Must contain a temporal dimension surrogate key (e.g. date dimension)
  • Measures should be nullable – this has an impact on aggregate functions (SUM, COUNT, MIN, MAX, AVG, etc.)
  • Dimension surrogate keys (srky) should have a foreign key (FK) constraint
  • Do not place the dimension processing in the fact jobs
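A minimal DDL sketch that follows these practices, using hypothetical sales tables:

-- Suffix naming, a date-dimension surrogate key, nullable measures, and FK constraints
CREATE TABLE SALES_FACT (
    DATE_SRKY      INTEGER NOT NULL,  -- temporal dimension surrogate key
    PRODUCT_SRKY   INTEGER NOT NULL,  -- dimension surrogate key
    SALES_AMOUNT   DECIMAL(18,2),     -- nullable measure
    SALES_QUANTITY INTEGER,           -- nullable measure
    CONSTRAINT SALES_FACT_DATE_FK FOREIGN KEY (DATE_SRKY) REFERENCES DATE_DIM (DATE_SRKY),
    CONSTRAINT SALES_FACT_PRODUCT_FK FOREIGN KEY (PRODUCT_SRKY) REFERENCES PRODUCT_DIM (PRODUCT_SRKY)
);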


Database – What is DML?

SQL (Structured Query Language)

What is DML (Data Manipulation Language)?

As the name indicates, DML (Data Manipulation Language) is for working with the information inside a database structure. There are four main DML commands:

  • Select – reading data rows
  • Insert – adding data rows
  • Update – changing values within data rows
  • Delete – removing data rows
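A hedged example of each command, against a hypothetical CUSTOMER table:

SELECT CUSTOMER_ID, CUSTOMER_NAME FROM CUSTOMER WHERE CUSTOMER_STATUS = 'A';  -- read rows
INSERT INTO CUSTOMER (CUSTOMER_ID, CUSTOMER_NAME) VALUES (1001, 'Acme');      -- add a row
UPDATE CUSTOMER SET CUSTOMER_STATUS = 'I' WHERE CUSTOMER_ID = 1001;           -- change values
DELETE FROM CUSTOMER WHERE CUSTOMER_ID = 1001;                                -- remove a row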
