We have moved to www.dataGenX.net. Keep learning with us.

Monday, December 23, 2013

How to Schedule Jobs and Sequencers via Datastage Scheduler


Datastage Native Job Scheduler

DataStage offers a scheduling option, but it does not include a scheduler of its own; it leverages the underlying operating system. On UNIX that means cron for recurring schedules and at for 'one off' schedules, so a check of the crontab entries for the scheduling user will show what you need. On Windows, it uses Windows Scheduled Tasks.

From the operating system command line, logged in as the scheduling user, a "crontab -l" will list the scheduled jobs.
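As a hedged illustration of what you would see there (the entry below is hypothetical, not real output -- the install path, project name, and job name all vary by environment), a DataStage-scheduled job appears as an ordinary cron entry invoking the engine's dsjob command:

```shell
#!/bin/sh
# Hypothetical example of a cron entry as DataStage's scheduler might write it.
# The path, project (MyProject) and sequence (DailyLoadSeq) are illustrative.
sample_crontab='30 2 * * 1-5 /opt/IBM/InformationServer/Server/DSEngine/bin/dsjob -run MyProject DailyLoadSeq'

# As with "crontab -l | grep dsjob", filter entries for DataStage invocations:
printf '%s\n' "$sample_crontab" | grep -c 'dsjob -run'    # prints 1
```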

Thursday, December 19, 2013

Hash Files in DataStage


Guys, sharing some nice, detailed information on hash files. It's old, but good to know.

What Exactly Are Hash Files?


DataStage® Server Edition utilizes several different types of hash files. The default and most versatile type is a dynamic (more precisely, a type 30) hash file. A hash file can simply be described as a file that distributes data throughout a pre-sized, space-allocated file. Every row has to have a key that determines where the row will reside within the file. This key is run through an algorithm (technically a hashing algorithm, hence the name) to determine the location in the file for the row. One and only one row can exist at a single location.
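The key-to-location idea can be sketched outside DataStage. The snippet below is purely illustrative -- it uses cksum's CRC as a stand-in hash and is not DataStage's actual hashing algorithm:

```shell
#!/bin/sh
# Map a key to one of N groups (buckets) -- the same mechanism a hash file
# uses to decide where a row lives. cksum's CRC is only a stand-in hash.
bucket_for() {
  key=$1
  groups=$2
  crc=$(printf '%s' "$key" | cksum | awk '{print $1}')
  echo $(( crc % groups ))
}

# The same key always lands in the same group; different keys spread out.
bucket_for "CUST1001" 64
bucket_for "CUST1002" 64
```

Because the mapping is deterministic, a lookup re-hashes the key and reads only that one location instead of scanning the whole file.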

Thursday, December 12, 2013

Interview Questions : DataStage - self-3


100    If the 1st and 8th records are duplicates, which one will be skipped? Can you configure this?
101    How do you import and export DataStage jobs? What is the file extension? (Look at each component while importing and exporting.)
102    How do you rate yourself in DataStage?
103    Explain the DataStage architecture.
104    What is the repository? What are the repository items?
105    What is the difference between a routine and a transform?
106    When do you write routines?

Wednesday, December 11, 2013

A SQL Client Tool - TeraData Studio Express



Teradata Studio Express is a graphical Java program, developed on the Eclipse Rich Client Platform (RCP), that allows you to view the structure of a JDBC-compliant database, browse the data in tables, issue SQL commands, and more.
It is a nice tool that can connect to many databases, such as the Aster database, DB2 for LUW, DB2 for i5/OS, DB2 for z/OS, Oracle, SQL Server, generic JDBC connections and, of course, the Teradata database.


DataStage Scenario - Problem6


Goal : Get the count of vowels in each column value

Input :

Akash Aggrawal
Priya Awasthi  
Anil chahal    
Diya Singh    
Kashish Patel 
Sunil Verma    
Rashid Patel    
Rashmi Arya   
Gopal Joshi     
Neha Tomar    
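A quick sketch of the expected logic outside DataStage (in a real job this would typically be a Transformer derivation; the awk one-liner below is only an illustration):

```shell
#!/bin/sh
# For each input name, count the vowels (case-insensitive) and emit
# "name,count". awk's gsub() returns the number of characters it replaced.
count_vowels() {
  awk '{ s = $0; n = gsub(/[AEIOUaeiou]/, "", s); print $0 "," n }'
}

count_vowels <<'EOF'
Akash Aggrawal
Priya Awasthi
EOF
# prints:
# Akash Aggrawal,5
# Priya Awasthi,5
```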

Tuesday, December 10, 2013

GRE Word List Flash Card



A flash card is any of a set of cards bearing information, such as words, on both sides, used in classroom drills or in private study. Flash cards are widely used as a learning drill to aid memorization. We have created an online version of flash cards for students to memorize GRE vocabulary for free. You will find a word on the front and the answer overleaf.
Click on a card below to see the meaning and contextual use of the word.


Monday, December 09, 2013

DataStage Scenario - Problem5


Goal : Count the occurrences of each character

i/p
1,a
2,b
3,a
4,b
5,a
6,a
7,b
8,a
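As a sketch of the expected result (in DataStage this is usually an Aggregator stage grouping on the character column; the awk below is only an illustration):

```shell
#!/bin/sh
# Count how many times each character appears in the second column of
# "id,char" input, then print "char,count" sorted by character.
count_chars() {
  awk -F, '{ c[$2]++ } END { for (k in c) print k "," c[k] }' | sort
}

count_chars <<'EOF'
1,a
2,b
3,a
4,b
5,a
6,a
7,b
8,a
EOF
# prints:
# a,5
# b,3
```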

Friday, December 06, 2013

List of Environment Variables in DataStage



General Job Administration

APT_CHECKPOINT_DIR
APT_CLOBBER_OUTPUT
APT_CONFIG_FILE
APT_DISABLE_COMBINATION
APT_EXECUTION_MODE
APT_ORCHHOME
APT_STARTUP_SCRIPT
APT_NO_STARTUP_SCRIPT
APT_STARTUP_STATUS
APT_THIN_SCORE

Wednesday, December 04, 2013

How to setup environment variables in DataStage


In DataStage you can set environment variables using three different methods. The right method depends on how often you need to change the value of an environment variable and how specific this value is for a particular project or job. Here is a description of these methods:
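One common place to set an instance-wide variable is the engine's dsenv file, which is sourced at engine startup. The sketch below simulates that with a temporary file so it is safe to run anywhere; the STAGING_DIR variable is hypothetical, and the real dsenv path is install-specific:

```shell
#!/bin/sh
# Simulate dsenv with a temp file -- the real one typically lives under the
# DSEngine directory (e.g. .../Server/DSEngine/dsenv; path varies by install).
DSENV=$(mktemp)

# dsenv entries are plain Bourne-shell exports; STAGING_DIR is hypothetical.
echo 'STAGING_DIR=/data/staging; export STAGING_DIR' >> "$DSENV"

# The engine sources dsenv at startup; we do the same here to demonstrate.
. "$DSENV"
echo "$STAGING_DIR"     # prints /data/staging

rm -f "$DSENV"
```

Changes to the real dsenv take effect only after the engine is restarted, which is why per-project and per-job settings are often more convenient.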

Tuesday, December 03, 2013

Error : Datastage Job Aborts with "The record is too big to fit in a block"


To fix this error you need to increase the block size to accommodate the record size:

1. Log into Designer and open the job.

2. Open Job Properties --> Parameters --> Add Environment Variable and select:
   APT_DEFAULT_TRANSPORT_BLOCK_SIZE
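A hedged sketch of how one might pick the new value: the usual default for APT_DEFAULT_TRANSPORT_BLOCK_SIZE is 131072 bytes (128 KB), and a simple strategy is to double it until it can hold the largest record. The doubling approach here is illustrative; check the chosen value against your version's documented limits.

```shell
#!/bin/sh
# Double the transport block size until it can hold the largest record.
# 131072 bytes (128 KB) is the usual default starting point.
next_block_size() {
  rec=$1
  size=131072
  while [ "$size" -lt "$rec" ]; do
    size=$(( size * 2 ))
  done
  echo "$size"
}

next_block_size 200000   # a 200000-byte record needs a 262144-byte block
```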