US-NJ-Jersey City (will consider relocating)
• Over 7 years of experience in data modeling, data warehouse design, development, and testing across the ETL and data migration life cycle using IBM WebSphere DataStage 8.x/7.x.
• Expertise in building Operational Data Stores (ODS), Data Marts, and Decision Support Systems (DSS) using multidimensional models (Kimball and Inmon) and Star and Snowflake schema design.
• Experience in analyzing the data generated by the business process, defining granularity, mapping data elements from source to target, and creating indexes and aggregate tables for data warehouse design and development.
• Experience in designing and implementing Data Mart applications, mainly transformation processes, using DataStage (8.0/7.x): designing and developing jobs with DataStage Designer, Manager, Director, and Debugger.
• Efficient in all phases of the development life cycle, including data cleansing, data conversion, performance tuning, and system testing.
• Excellent at building highly scalable parallel processing jobs using DataStage Parallel Extender.
• Efficient in incorporating various data sources such as Oracle, MS SQL Server, DB2, Sybase, XML, and flat files into the staging area.
• Experience in mapping server/parallel jobs in DataStage to populate tables in data warehouses and data marts.
• Proven track record in addressing production issues such as performance tuning and enhancement.
• Excellent knowledge of creating and managing conceptual, logical, and physical data models.
• Experience in dimensional and relational database design.
• Strong in data warehousing concepts and in dimensional Star Schema and Snowflake Schema methodologies.
• Expert in unit testing, system integration testing, implementation, maintenance, and performance tuning.
• Experience with scheduling tools such as AutoSys for automating and scheduling job runs.
• Excellent with PL/SQL, T-SQL, stored procedures, database triggers, and SQL*Loader.
• Experience in UNIX shell scripting.
• Excellent knowledge of Windows, UNIX, and Macintosh operating systems, and of databases including Oracle, SQL Server, and DB2.
• Experience in implementing quality processes such as ISO 9001:2000 and audits.
• Detail-oriented with good problem-solving, organizational, and analytical skills; highly motivated and adaptive, with the ability to grasp things quickly.
• Ability to work effectively and efficiently both in a team and individually, with excellent interpersonal, technical, and communication skills.
Education Qualifications: Master's in Electrical and Computer Engineering.
Skill Sets: IBM Information Server V8.1 (DataStage, QualityStage, Information Analyzer), Ascential DataStage V7.5 (Designer, Director, Manager, Parallel Extender), Oracle 8i/9i/10g, MS SQL Server 2005/2008, DB2 UDB, MS Access, Sybase, SQL, PL/SQL, SQL*Plus, flat files, sequential files, TOAD 9.6, Erwin, Microsoft Visio, Oracle Developer 2000, SQL*Loader, IBM Cognos 8.0, IBM AIX UNIX, Red Hat Enterprise Linux 4, UNIX shell scripting, Windows NT/XP, Macintosh, C, C++, VB scripting.
Prudential Financial, Newark, NJ 1/2009 - Present
DataStage Developer
Prudential Financial, Inc. is a Fortune Global 500 company that provides insurance, investment management, and other financial products and services to both retail and institutional customers throughout the United States and in over 30 other countries. The project objective was to collect, organize, and store data from different operational data sources to provide a single source of integrated and historical data for reporting, analysis, and decision support, improving client services.
Hardware/Software: IBM DataStage 8.0 (Designer, Director, Manager, Parallel Extender), Oracle 10g, SQL Server 2008, DB2 UDB, flat files, sequential files, AutoSys, TOAD 9.6, SQL*Plus, AIX UNIX, IBM Cognos 8.0
Responsibilities:
• Interacted with the end-user community to understand business requirements and identify data sources.
• Analyzed existing informational sources and methods to identify problem areas and make recommendations for improvement, which required a detailed understanding of the data sources and research into possible solutions.
• Implemented the dimensional model (logical and physical) in the existing architecture using Erwin.
• Studied the existing PL/SQL code to relate the source and target mappings.
• Helped prepare the source-to-target mapping document.
• Worked with DataStage Manager to import metadata from the repository, create new job categories, and create new data elements.
• Designed and developed ETL processes using DataStage Designer to load data from Oracle, MS SQL Server, flat files (fixed width), and XML files into the staging database, and from staging into the target data warehouse database.
• Used DataStage stages including Hashed File, Sequential File, Transformer, Aggregator, Sort, Data Set, Join, Lookup, Change Capture, Funnel, Peek, and Row Generator in the ETL coding.
• Developed job sequencers with proper job dependencies, job control stages, and triggers.
• Used QualityStage to ensure consistency by removing data anomalies and spelling errors from the source information before it was delivered for further processing.
• Extensively used DataStage Director to monitor job logs and resolve issues.
• Involved in performance tuning and optimization of DataStage mappings, using features such as pipeline and partition parallelism and data/index caches to manage very large volumes of data.
• Documented ETL test plans, test cases, test scripts, and validations based on design specifications for unit testing, system testing, and functional testing; prepared test data for testing, error handling, and analysis.
• Used the AutoSys job scheduler to automate the regular monthly run of the DW cycle in both production and UAT environments.
• Verified Cognos reports by extracting data from the staging database using PL/SQL queries.
• Wrote configuration files for performance in the production environment.
• Participated in weekly status meetings.
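The report-verification step above amounted to running reconciliation queries against the staging database. A minimal sketch of that kind of check, assuming a hypothetical stg_policy staging table (the actual project used PL/SQL against Oracle; this sketch uses Python's built-in sqlite3 for illustration):

```python
import sqlite3

# Hypothetical staging table; names and figures are invented for illustration.
conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.execute("CREATE TABLE stg_policy (policy_id INTEGER, premium REAL)")
cur.executemany("INSERT INTO stg_policy VALUES (?, ?)",
                [(1, 100.0), (2, 250.5), (3, 75.25)])

# Row count and control total: the two figures compared against the report output.
cur.execute("SELECT COUNT(*), ROUND(SUM(premium), 2) FROM stg_policy")
row_count, premium_total = cur.fetchone()
print(row_count, premium_total)  # 3 425.75
```

If the report's record count and control total match these staging-side figures, the load is considered reconciled.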
Kaiser Permanente, Pleasanton, CA 05/2007 - 12/2008
ETL Designer / DataStage Developer
Kaiser Permanente, an integrated managed care organization, is the largest health care organization in the United States. The Health Plan and Hospitals operate under state and federal non-profit tax status, while the Medical Groups operate as for-profit partnerships or professional corporations in their respective regions.
The project was to design, develop, and maintain a data warehouse for the client's vendor data and internal reference data, and to work with their DBA to ensure that the physical build adhered to the model blueprint.
Hardware/Software: DataStage 7.5.1 Enterprise Edition, QualityStage, flat files, Oracle 10g, SQL Server 2005/2008, Erwin 4.2, PL/SQL, UNIX, Windows NT/XP
Responsibilities:
• Involved in understanding business processes and coordinated with business analysts to gather specific user requirements.
• Studied the existing data sources to determine whether they supported the required reporting, and generated change data capture requests.
• Used QualityStage to check the data quality of the source system prior to the ETL process.
• Worked closely with DBAs to develop the dimensional model using Erwin and created the physical model using forward engineering.
• Worked with DataStage Administrator to create projects and define the hierarchy of users and their access.
• Defined the granularity, aggregation, and partitioning required at the target database.
• Involved in creating specifications for ETL processes, finalized requirements, and prepared the specification document.
• Used DataStage as the ETL tool to extract data from source systems and load it into the SQL Server database.
• Imported table/file definitions into the DataStage repository.
• Performed ETL coding using Hashed File, Sequential File, Transformer, Sort, Merge, and Aggregator stages; compiled, debugged, and tested the jobs. Extensively used the available stages to redesign DataStage jobs for the required integration.
• Extensively used DataStage tools such as DataStage Designer and DataStage Director to develop jobs and to view log files for execution errors.
• Controlled job execution using sequencers and used notification activities to send email alerts.
• Ensured that the data integration design aligned with the established information standards.
• Used Aggregator stages to sum the key performance indicators used in decision support systems.
• Scheduled job runs using DataStage Director, and used it for debugging and testing.
• Created shared containers to simplify job design.
• Performed performance tuning of the jobs by interpreting their performance statistics.
• Documented ETL test plans, test cases, test scripts, and validations based on design specifications for unit testing, system testing, functional testing, and regression testing; prepared test data for testing, error handling, and analysis.
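A rough Python analogue of the Aggregator-stage KPI summation mentioned in the list above: group incoming rows by a key column and sum a measure. The field names and figures here are invented for illustration; the actual work was done inside DataStage's Aggregator stage.

```python
from collections import defaultdict

# Hypothetical input rows; in DataStage these would arrive from an upstream link.
rows = [
    {"region": "East", "claims_paid": 120.0},
    {"region": "West", "claims_paid": 80.0},
    {"region": "East", "claims_paid": 30.0},
]

# Group by region and sum the KPI, as an Aggregator stage would.
totals = defaultdict(float)
for row in rows:
    totals[row["region"]] += row["claims_paid"]

print(dict(totals))  # {'East': 150.0, 'West': 80.0}
```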
Macy's, Atlanta, GA 12/2005 - 04/2007
ETL Developer
Macy's (NYSE: M) is a chain of mid-to-high-range American department stores delivering fashion and affordable luxury to customers coast to coast. Online shopping is offered through macys.com. Its selection of merchandise can vary significantly from location to location, resulting in the exclusive availability of certain brands only in higher-end stores.
The aim of the project was to build a data warehouse that would keep historical data according to a designed strategy. Flat files and Oracle tables were part of the source data, which arrived on a daily, weekly, and monthly basis.
Hardware/Software: IBM Information Server DataStage 7.5, Oracle 10g, SQL, PL/SQL, UNIX, SQL*Loader, Autosys, Business Objects 6.1, Windows 2003, IBM AIX 5.2/5.1, HP Mercury Quality Center 9.0
Responsibilities:
• Involved in understanding business processes and coordinated with business analysts to gather specific user requirements.
• Used Information Analyzer for column analysis, primary key analysis, and foreign key analysis.
• Extensively worked on DataStage jobs to split bulk data into subsets and dynamically distribute them to all available processors to achieve the best job performance.
• Developed ETL jobs per the business rules in the ETL design document.
• Converted complex job designs into separate job segments and executed them through a job sequencer for better performance and easier maintenance.
• Used DataStage mappings to load data from source to target.
• Enhanced the reusability of jobs by creating and deploying shared containers and multiple instances of the jobs.
• Imported data residing in host systems into the data mart developed in Oracle 10g.
• Extensively used AutoSys to automate job scheduling on a daily, bi-weekly, weekly, and monthly basis with proper dependencies.
• Wrote complex SQL queries using joins, subqueries, and correlated subqueries.
• Performed unit testing and system integration testing by developing and documenting test cases in Quality Center.
• Validated reports generated with Business Objects using PL/SQL queries.
• Worked on troubleshooting, performance tuning, and performance monitoring for enhancement of DataStage jobs and builds across the development, QA, and production environments.
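A minimal sketch of a correlated subquery of the kind listed above, where the inner query is re-evaluated for each outer row. The orders table and its columns are hypothetical; the actual queries ran against Oracle, while this sketch uses Python's built-in sqlite3.

```python
import sqlite3

# Hypothetical data: each customer's orders.
conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.executescript("""
CREATE TABLE orders (customer_id INTEGER, amount REAL);
INSERT INTO orders VALUES (1, 50), (1, 150), (2, 80), (2, 40);
""")

# For each customer, keep only the orders above that customer's own average:
# the inner query is correlated on customer_id with the outer row.
cur.execute("""
SELECT customer_id, amount
FROM orders o
WHERE amount > (SELECT AVG(amount)
                FROM orders i
                WHERE i.customer_id = o.customer_id)
ORDER BY customer_id
""")
result = cur.fetchall()
print(result)  # [(1, 150.0), (2, 80.0)]
```

Customer 1 averages 100 so only the 150 order qualifies; customer 2 averages 60 so only the 80 order qualifies.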
Citibank Inc., New York, NY 12/2004 - 11/2005
DataStage Developer
Citibank, a leading global banking company, has some 200 million customer accounts and does business in more than 100 countries, providing services to consumers, corporations, governments, and institutions.
The project was to transform data coming from various sources through multiple stages before loading it into the data warehouse, and to maintain the warehouse thereafter.
Responsibilities:
• Involved in understanding business processes to learn the business requirements.
• Extracted data from different systems into the staging area; mainly involved in ETL development.
• Defined and implemented approaches to load and extract data from the database using DataStage.
• Worked closely with the data warehouse architect and business intelligence analyst in developing solutions.
• Used Erwin for data modeling (i.e., modifying the staging model and SQL scripts in the Oracle and MS Access environments).
• Involved in designing source-to-target mappings between sources and operational staging targets using DataStage Designer.
• Performed ETL coding using Hashed File, Sequential File, Transformer, Sort, Merge, and Aggregator stages; compiled, debugged, and tested the jobs. Extensively used the available stages to redesign DataStage jobs for the required integration.
• Executed jobs through a sequencer for better performance and easier maintenance.
• Involved in unit, performance, and integration testing of DataStage jobs.
• Used DataStage Director to run and monitor the jobs for performance statistics.
• Involved in performance tuning of the jobs.
• Used T-SQL to validate the data generated at the OLAP server.
Wipro Technologies is an Indian multinational company and a leading provider of integrated business, technology, and process solutions on a global delivery platform.
Worked as a QA tester on a web-based e-billing application. The application has two modules, accounts payable (AP) and accounts receivable (AR), to keep track of transactions.
Hardware/Software: Windows XP, ASP.NET, Test Director 7.2, WinRunner 7.0, SQL Server 2000
Responsibilities:
• Actively participated in decision making and QA meetings, and regularly interacted with business analysts and the development team to gain a better understanding of the business process, requirements, and design.
• Analyzed business requirements from a black-box testing perspective, along with the system architecture/design, and converted them into functional requirements/test cases.
• Used Test Director to document requirements and created traceability matrices for them.
• Developed a test plan that included the scope of the release, entrance and exit criteria, and the overall test strategy. Created detailed test cases and test sets and executed them manually.
• Performed cross-browser testing to verify that the application provides accurate information in different browsers (IE, Netscape, Firefox, Safari).
• Extensively used output and checkpoint steps to verify UI properties and values using VB scripting.
• Performed back-end database verification manually and used WinRunner to automatically verify the database against the values entered during automated testing.
• Performed back-end integration testing to ensure data consistency on the front end by writing and executing SQL queries.
• Provided management with metrics, reports, and schedules, and was responsible for entering and tracking bugs.
• Ensured that defects were always written with a great level of detail.