Data-driven organizations need low-friction, highly available, and fast access to mainframe data as input for distributed applications, microservices, and other business processes. Yet access to this data has historically meant overcoming multiple obstacles, leading to project delays, cost overruns, and compromises in speed, security, and flexibility.
Confluent and Luminex will discuss how to use Apache Kafka and Mainframe Data Integration (MDI) to build efficient, highly available, and agile mainframe data pipelines with an event streaming platform that can be made available to an unlimited number of applications, without the struggles or compromises of the past. Hitachi Vantara will demonstrate how Pentaho enables enterprises to leverage these secure, high-speed event streams for better operational insights and business operations.
Data analysts, enterprise architects, application developers, and anyone else who provides or relies on access to mainframe data can benefit from this modern approach to mainframe data integration.
Art Tolsma, CEO, Luminex
Jeff Bean, Partner Solutions Architect, Confluent
Derek Wilson, Digital Solutions Engineer, Hitachi Vantara
This webinar is sponsored by Luminex and does not necessarily represent the view of SHARE and/or SHARE members.