Mainframe Technology Overview March 2008
Mainframe Technology Overview 1. BluePhoenix Mainframe Architecture 2. C-Scan 3. Toolbox and IT Discovery 4. Repository Files 5. IT Discovery Database and Query Facility
BluePhoenix Mainframe Architecture Integration of the following: Methodology  – Designed to solve the problem systematically with little disruption to the client.  C-Scan  – Conversion engine with its own interpreted language components and exits. ISPF Panels and REXX  – Workflow engine that enforces the methodology and controls processing (batch and online). Mini-Scheduler  – Controls batch job flows. Toolbox  – JCL and programs in C-Scan and REXX for analysis and conversion. Repository  – Set of detailed tables, describing the environment and containing rules for conversion.
BluePhoenix Mainframe Architecture (process flow diagram) – Phases: Inventory Reports; Generation of Tools; Conversion; Unit Test; System Test; Implementation; Repository Enhancement. Artifacts: Repository Files; Specific Cluster Metadata Libraries; Generation Parameters Libraries; Converted Libraries; Implementation Parameters.
C-Scan Assembler Interpreter  – Provides some of the assembler abilities, such as dynamic file allocation and edit of PDS directory. Event Driven Engine  – Driven by record-by-record automated reading, with logic controlled by events or keys. Writes Records  – Creates records with implicit and explicit location setting of fields; is able to write and execute its own code. Flexible  – Has COBOL-like structures for multiple file processing; Record Merge; Subprogram. Easy to Learn  – Simple and consistent syntax.
The Toolbox Global Assessment/Impact Analysis  – Performs inventory definition and analysis for migration or field adjustment projects. FieldEnabler  – Performs conversions of field lengths and formats automatically while protecting the corporate knowledge asset. COBOL/LE-Enabler  – Performs COBOL/LE conversions. DBMSMigrator  – Performs migration from non-relational to relational databases and from their access language to SQL (DBMSMigrator is a PC-based tool).
Converting Programs, Files, Control Statements, and JCL Only C-Scan scripts that are themselves produced by C-Scan are used. The toolbox is self-documenting and controlled, and it comments the converted source.
Global Assessment and Field Adjustment Global Assessment: IT Discovery, a product made up of C-Scan programs (and others) used for the static understanding of the application components within a full system environment. Field Adjustment: The Toolbox, tools made up of C-Scan programs (and others) used for the analysis and eventual conversion of specific system components through dynamic propagation of field-tracking data within and between programs. Both the tools and the products are built on C-Scan.
Toolbox Selection Screen (1 of 3)
Toolbox Selection Screen (2 of 3)
Toolbox Selection Screen (3 of 3)
Some Repository Files DBJCL DBDSN01 DBSOURCE DBDTSCAN DBPGROUT DBLOAD DBTRAN DBCALL INVLIST
Mainframe Technology Overview 1. BluePhoenix Mainframe Architecture 2. C-Scan 3. Toolbox and IT Discovery 4. Repository Files 5. IT Discovery Database and Query Facility
Entry-points (Events) INIT  - Before the first record of the input. TERM  - After the end of the input file. INITMEM  - Before reading the first record of a member of a library. TERMMEM  - After the last record was read from a member of a library. OUTREC  - After reading each record. HEADER  - At the beginning of a page when generating reports. TRAILER  - At the end of a page when generating reports. KEYS  - When the defined keys are changed.
Power of C-Scan Some of the Available Variables $$WORK Work area for that C-Scan step $$GLOB Work area that bridges multiple C-Scan steps $$RECORD The input record $$OUTREC The output record  $$DSN The DSN on the input DD statement - PDS or SEQ $$MEMBER The member name within a PDS $$PNVMEM The member name of a PANVALET data set $$VOLSER The VOLSER of the input dataset  $$DATE The current date in installation format $$DATER The current date in format “mmm, dd yyyy” $$TIME The current time (HH:MM:SS) $$JOB The name of the job specified on the JOB statement $$STEP The name of the step from the EXEC statement $$UID The UserID that last updated the member
Power of C-Scan INDEX Function Searches for a string in the specified part of the source record and can use specialized wildcards. Returns 0 to indicate false (string not found) or the position of the found string. Creates the following variables: $$IDXA The text specified by the INDEX function. $$IDXB The text following the text that was found. $$IDXC The text preceding the text that was found.
Power of C-Scan Some Tables Used to delimit fields (set pointers). Continue as long as… ALU Uppercase Alphabetic ALN Alphanumeric (Uppercase) NUM Number DSN Characters acceptable for dataset names COB Characters acceptable for COBOL field names N… Continue as long as it is not … Specific characters and wildcards can also be used.
Power of C-Scan Use of Index, Tables, and Wildcards From a source library, create a list of fields with a PIC of X(50), along with the member name:

//SYSCIN DD *
S010_RPT BUILD,DD1=IN010,DD2=OUT010,PARMDD=P010,
         D=|#+?|,SORTO='(1,8,CH,A,10,32,CH,A)'
//P010   DD *
OUTREC (
  IF (INDEX(6,72,' ?PIC ?X(50)'),EQ,0)  EXIT(LEAVEREC)
  IF ($$IDXC(1,COB,32),EQ,'FILLER')  EXIT(LEAVEREC)
  BUILD($$MEMBER,10:$$IDXC(1,COB,32),/)
)
C-Scan in Production  –  JCL
C-Scan in Production – Control Statements
C-Scan in Production – Control Statements
C-Scan in Production  –  XTBLDREC
Mainframe Technology Overview 1. BluePhoenix Mainframe Architecture 2. C-Scan 3. Toolbox and IT Discovery 4. Repository Files 5. IT Discovery Database and Query Facility
The Mainframe Tool Libraries JOB (JFIX; JUSER) – JCL and Mini-Scheduler lists PARM (PFIX; PUSER) – Programs and control tables written in C-Scan REXX – Mini-Scheduler and Screens JCLFIX – JCL templates for situation-specific jobs
Controlling the Flow – Screens
Controlling the Flow – Batch: The Mini-Scheduler All jobs are submitted with the same JOBNAME and JOBCLASS. Jobs have two steps added, STEP00 as the first step and STEPFF as the last: STEPFF writes to the LOG that the JOB completed successfully (based on a Condition Code less than 5). STEP00 causes a failure (System 806-4) if the previous JOB’s STEPFF did not write to the LOG (based on Run Type). This permits a high level of automation and a high number of jobs that remain easy to debug.
What We Need to Know About a Program: language; files; other programs with the same files; JCL (or OLTP) used; routines called; routines calling; copies; loads; DBMS structure; etc.
What We Need to Know
What We Need to Know: Collect JCL with PROCs (flow diagram) – Source JCL; ITD External Reader; OPJCL; PROCLIB; ITD; DBJCL.
Cross Referencing Components Cross-referencing loads, JCL, CICS, and source Identifying “missing” entities – sources not run, JCL or TRANS without source There is a turnaround screen to link load and source where names differ
Multi-Language Interface (diagram) – Source and Copy languages: COBOL, Meta-COBOL, PL/I, Easytrieve, Assembler, Other (4th GL); other inputs: JCL, PROCs, PARMs, CSD, DBMS, Load; #SRV22; Master Repository (Inventory, Date Fields, Records); Repository Reports.
Revise Commands, Conflicts, and Fixes Screen
IT Discovery Provides understanding of the mainframe operational and application environments, their inventory and interrelationships. Accesses, analyzes, and cross-references software components on an MVS mainframe. Makes sense of the spaghetti that is an MVS application environment. Finds redundancies, missing entities, and inefficient interconnections. Creates a DB2 database of the components, reportable through SQL. Includes a network-based query facility (QF).
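Because the repository lands in DB2, ordinary SQL can answer cross-reference questions directly. A minimal sketch, assuming hypothetical table and column names (the actual layout is the one shown in the DB2 Repository slides):

-- Hypothetical cross-reference: which jobs and steps execute a given program?
-- TBJCL, its columns, and the program name are assumed for illustration only.
SELECT JOBNAME, STEPNAME, PROCNAME
  FROM TBJCL
 WHERE PGMNAME = 'CUSTUPDT'
 ORDER BY JOBNAME, STEPNAME;

The same style of query extends to loads, transactions, and datasets through the other repository tables.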
BluePhoenix IT Discovery – Static Information Collection (diagram) – Collected inputs: SOURCE and COPIES (OLTP and Batch); Flat Files; JCL, PROC, and Control Statements; LOAD Modules; CICS Tables (CSD); IMS/DC; Database Definitions (DB2, IMS, IDMS); Job Schedulers. Targets and outputs: IT Discovery Repository; IT Discovery Repository DB2; Reports; Queries; Network Query Facility; Web Queries.
Process Stages (Stage – Input – Output)
Stage 1, Build Environment. Input: Product libraries; Customer parameters. Output: Customized environment; Empty Repository datasets.
Stage 2, Analyze Inventory Components. Input: Production source programs and copy libraries; Production JCL, PROC, and Control statement libraries; Production Load libraries; Production OLTP control data; Production DB formats. Output: Populated Repository; Reports; Turnaround: Resolve Load without corresponding source.
Stage 3, Analyze Source Entities. Input: Production source programs and copy libraries; Repository Files. Output: Updated Repository; Reports; Turnaround: Resolve variables containing routine names; Resolve variables containing DD names.
Stage 4, Process Relational Database. Input: Database parameter definitions; Repository Files. Output: DB2 Repository.
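The stage 2 turnaround ("Resolve Load without corresponding source") is worked through the repository files and turnaround screens; once the data reaches the stage 4 DB2 repository, the same check could also be expressed as a simple anti-join. A hedged sketch, with table and column names assumed for illustration:

-- Hypothetical check: load modules with no matching source member.
-- TBLOAD, TBSOURCE, and their columns are assumed names, not the documented layout.
SELECT L.LOADNAME
  FROM TBLOAD L
 WHERE NOT EXISTS
       (SELECT 1
          FROM TBSOURCE S
         WHERE S.PGMNAME = L.LOADNAME)
 ORDER BY L.LOADNAME;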
Analyze Source Entities This step does the in-depth static analysis of the source programs.  ITD builds a Repository file that lists and details entities and relationships from within COBOL programs to the “outside world” including flat files, VSAM, DB2, IMS, IDMS and routines. Other languages are analyzed without routine calls. The Repository file includes detailed information on the interfaces between routines and between programs and data.
Analyze Source Entities Turnaround Screens “Variable Routines” Where a routine is called using a variable field for the name, and that variable is not defined as a constant in the Data Division.  “Variable DDNAMEs” Where a DDNAME is a variable field, and that variable is not defined as a constant in the Data Division.
Query Facility Architecture (diagram) – IT Discovery Repository (DB2), a QF Server, and multiple QF Clients.
ITD – Incremental Running (diagram) – IT Discovery Collection Scripts; Control Datasets; Catalog; Libraries; DB Definitions; C-Scan Engine; IT Discovery Repository – Changes, Additions, and Deletions ONLY.
Mainframe Technology Overview 1. BluePhoenix Mainframe Architecture 2. C-Scan 3. Toolbox and IT Discovery 4. Repository Files 5. IT Discovery Database and Query Facility
DBSOURCE Dataset (1 of 4)
DBSOURCE Dataset (2 of 4)
DBSOURCE Dataset (3 of 4)
DBSOURCE Dataset (4 of 4)
IT Discovery – DB2 Repository
IT Discovery – DB2 Repository
Tbinvlist Table (1 of 2)
Tbinvlist Table (2 of 2)
Mainframe Technology Overview 1. BluePhoenix Mainframe Architecture 2. C-Scan 3. Toolbox and IT Discovery 4. Repository Files 5. IT Discovery Database and Query Facility
Query Facility Architecture Thin Client – Users have no software installed and access the server via a browser (intranet) Server – Operates software that:  Provides screens and interfaces to user  Manages users, queries, and access to queries  Passes SQL queries from clients to mainframe (via DB2Connect) and returns results Mainframe – Contains ITD Relational Database (DB2)
Query Facility Users Administrator – Connects repositories; maintains Users, Public, and Shared query folders. Power User – Can create and run own queries in Private folder. Can copy to Shared folder. Can run queries from Public and Shared, and copy to Private. Regular User – Can only run queries from Public folder.
Query Facility Folders Public – Categories and queries available; “Read Only” for everyone; maintained by the Administrator. Shared – Categories and queries available; “Read, Write, Not Update” for Power Users (if enabled) and the Administrator. Private – Queries only; “Read, Write, and Update” for Power Users and the Administrator.
Query Facility – Categories and Queries Categories – Groupings of Queries; Categories belong in Folders. Queries – Use standard DB2 SQL; queries can have drill-down menus from specific results to other nested queries, as sketched below.
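A minimal sketch of what a Query Facility query might look like. The inventory table name follows the Tbinvlist slides shown earlier, but its column names and the parameter marker are assumptions for illustration; the facility supplies the parameter value at run time:

-- Hypothetical category query: list inventory members matching a name pattern.
-- Column names are assumed; '?' stands for the value the user enters in QF.
SELECT MEMBER_NAME, MEMBER_TYPE, SOURCE_LIB
  FROM TBINVLIST
 WHERE MEMBER_NAME LIKE ?
 ORDER BY MEMBER_NAME;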
Query Facility – Categories and Queries
Query Facility – Queries
Query Facility – Running and Drilling Queries are run by pressing EXECUTE. Parameter values follow normal SQL matching – avoid entering only %, which matches everything. Drill-down menus are available on underlined results. Nested queries are available through the drill-down menus.
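A drill-down can be pictured as a nested query that receives the value selected in the parent result set. A hedged sketch, again with assumed names (the DBCALL repository file suggests a call cross-reference, but the DB2 table and columns here are illustrative only):

-- Hypothetical nested query behind a drill-down: for the program name the
-- user clicked in the parent result, list the routines it calls.
SELECT CALLER, CALLED_ROUTINE, CALL_TYPE
  FROM TBCALL
 WHERE CALLER = ?
 ORDER BY CALLED_ROUTINE;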
Query Facility – Running Queries
Query Facility – Drilling Down
Query Facility – Nested Query
Query Facility – Filters and Export Results columns can be filtered for viewing Results can be exported to XML; CSV; Excel; HTML
Query Facility – Filtering Results
Query Facility – Exporting Results
Query Facility – Exporting Results
Query Facility – Creating Queries Queries are created by Power Users and Administrators. Each query must belong to a Category. Drill-down menus can be created and linked.
Query Facility – Query Creation
Query Facility – Drill-Down Linking
Query Facility – Online Documentation
Query Facility – Online Documentation
Thank You!



Editor's Notes

  • #3: Methodology is SUN AM C-Scan language bits and bytes is SUN PM The major special purpose “exit” is MON AM; this includes exercise time The special purpose files that we use to analyze and convert systems, with their corresponding reports is MON PM Other special purpose “exits” is TUES AM; this includes exercise time, which will extend into the afternoon A walk through the conversion process including a case study you will do on your own is WED and THURS AM The Summary is THURS PM No homework, but you’ll find that reading through the reference material and reviewing your class notes will be helpful.
  • #4: We sell service, not technology. It is the combination of our people, our methodology, and the tools we provide that delivers our service.
  • #5: Describe the flow process of all 6 phases A questionnaire covers the management, development and maintenance processes, the existing systems and applications, the hardware, data and system software, the current practices, and IS principles. A survey and assessment is completed to identify the Year 2000 date affected components (software, hardware, procedures, databases, etc.). Clusters, data bridges and interfaces are also identified for conversion. The conversion process includes multiple phases in an iterative process, fine-tuning the conversion controls until compilable code is achieved. Intra-cluster tests (unit and regression) are performed. Intra-cluster tests are completed on all components within the cluster. Acceptance testing is a client-driven testing process that demonstrates the application’s ability to meet the acceptance criteria previously defined with the client.
  • #8: Benefits: Maximize automation of the conversion process. Minimize interference with Production Systems Maintenance. Minimize code freezing period. Progressive conversion of logically linked subsystems (clusters). Conversion process transparent to end user. Capabilities: Produces a database of the organization’s software inventory. Produces a database of all date fields and their cross references. Provides a large range of software inventory cross reference reports. Provides computerized templates that control the conversion process. Provides automated update capabilities that support db and application logic changes. Toolbox automated analysis, conversion, management, and control services simplifies the conversion process, significantly minimizing risks usually involved with such a large project. Consistently automates conversions minimizing human mistakes that are difficult to detect. Enables the correction of errors by applying changes across multiple applications. Develops control parameters to use as input to perform the actual conversion. Changes to program do not affect the performance of conversion. Customer retains control over source code, thereby minimizing freeze time.
  • #13: The first step of scanning the environment is to create a dataset of recent log activities. Depending on the client’s archiving procedures, up to 15 months of the system log files may be processed. The purpose is to identify the active jobs, programs, and on-line transactions in order to reduce the conversion repository to only active components. In this step the DBLOG dataset is created out of SMF and other monitor log files, such as Tmon, Omegamon, etc. The results are a list of jobs, programs, and transactions, including statistics of how often each has been used. The jobs should be tailored according to the site’s archiving procedure and system monitor type. In the case of a third party’s log manager such as MXG, a job should be tailored according to their reports. The base assumption is that these reports were verified by the client and found to be accurate.
  • #15: INIT - Example: INIT (BUILDW(10000' ')) to initialize work area with blanks. TERM - to calculate and write summary values of fields. INITMEM - example: INITMEM(BUILD(1:$$MEMBER)) to save member name. TERMMEM - Same as TERM but the summaries are per member. PROCESS - OUTREC - process a record from input. Give example on the board!!! APPEND - write example of a report!!! HEADER - create information for page header lines. TRAILER - Same as above for trailer lines. KEYS - example: KEYS(1,8,HEADER('DETAILS FOR':1,8,/))
  • #16: Examples should be prepared for those variables that need them.
  • #28: Survey questionnaires are distributed to the client: Site Preliminary Questionnaire One of the first contacts with the client will be to initiate a request for general information about the client’s environment. This will be done through a site Preliminary Questionnaire. The purpose of the questionnaire is to understand the complexity and quantities of the most important components of the client’s environment. The questionnaire also identifies the main programming languages, the main DBMSs, system monitors, and naming conventions. Use the document to identify components not yet supported by the Toolbox. System Questionnaires Each site will have many applications, projects or systems being processed. A System Questionnaire should be distributed to each client resource responsible for an application, project or system. The purpose of the questionnaire is to collect information about every active application system or project that exists on site and is valid for the conversion process. This will be done by obtaining the naming conventions per system and a list of libraries, files, databases and I/O modules. It is needed for setting up the tools for performing the survey and for clustering the systems. The information from the system questionnaires will also be used for learning the client’s environment. The information will be compared with the actual components found through scanning the environment using IT Discovery. Reports are included with IT Discovery for reporting discrepancies in the client information.
  • #30: This illustrates the process flow during Survey and Assessment using IT Discovery.
  • #32: The goal of this process is to create a relationship between a program and date fields. The process is primarily automated through IT Discovery jobs and includes the following steps: Repository Files Merged – This process involves merging the DBDTSCAN files. The language-oriented DBDTSCAN files will be merged into a common global repository file. Repository Files Populated – The final process of building the repository involves populating the repository file. In this step the unnormalized DBDTSCAN dataset is exploded into many additional datasets such as DBCOPY, DBCALL, DBCALLED, etc. Duplicate information is deleted and information about the same entity is merged from various records. Repository Transactions Completed – This process will complete the transaction information for the repository. The DBDTSCAN repository includes information about relationships between programs and transactions. This information is merged into DBTRAN. Up to this point in the process, the transaction repository includes information about the relations between transactions and programs. The objective of this step is to complete the information regarding the relations between on-line programs and files (DDnames, DSnames, or DBnames).