
Webinar: Streaming & dynamic modeling with Hadoop - much more than just big data

 

Data, data, data, data – microservices are a reality, data is being generated at an increasingly rapid rate, and all data must be continuously analyzed – preferably in real time. This webinar will take you step by step through all the logical components of the architecture and explain the latest trends and solutions from the Hadoop Ecosystem – including, of course, Apache Kafka, Spark, Avro and many more.

Are big data and streaming on your agenda for 2018? Then join our online webinar on February 22, 2018, from 11 am to 12 pm with our speakers Frank Schmidt and Christian Schroer!
 

Content of the webinar

  • Architecture overview – How the data is processed from start to finish.
  • Ingestion – How the data is moved in real time and how data serialization formats can help.
  • Processing – The easiest and most efficient ways of dealing with streaming and high-volume data.
  • Persistence – The benefits and drawbacks of long-term archiving.
  • Access – How business analysts, data scientists and others can work with the data.
     

Who is the webinar aimed at?

The webinar will interest anyone who would like step-by-step insight into how a modern data platform architecture is developed. There will also be practical tips on how to build a reliable, adaptive and efficient data pipeline using Kafka and Hadoop technologies. Participants will learn how to keep track of agile IT infrastructures with tightly controlled development cycles and constant schema changes, and how to make sure that their data platform always contains the latest data.
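To give a flavour of the kind of pipeline discussed above, here is a minimal sketch (not taken from the webinar material) of how events might be serialized with an Avro schema and published to Kafka from Python. The broker address, the topic name "clickstream" and all field names are illustrative assumptions:

    # Minimal sketch: publish Avro-encoded events to Kafka.
    # Assumes a local broker at localhost:9092 and the hypothetical topic
    # "clickstream"; uses the kafka-python and fastavro libraries.
    import io
    from fastavro import parse_schema, schemaless_writer
    from kafka import KafkaProducer

    # Avro schema; the optional "session_id" field with a default value
    # illustrates how new fields can be added without breaking old consumers.
    schema = parse_schema({
        "type": "record",
        "name": "ClickEvent",
        "fields": [
            {"name": "user_id", "type": "string"},
            {"name": "url", "type": "string"},
            {"name": "timestamp", "type": "long"},
            {"name": "session_id", "type": ["null", "string"], "default": None},
        ],
    })

    def serialize(event: dict) -> bytes:
        """Encode a single event as schemaless Avro bytes."""
        buf = io.BytesIO()
        schemaless_writer(buf, schema, event)
        return buf.getvalue()

    producer = KafkaProducer(bootstrap_servers="localhost:9092",
                             value_serializer=serialize)
    producer.send("clickstream", {"user_id": "u-42", "url": "/home",
                                  "timestamp": 1519292400000, "session_id": None})
    producer.flush()

The default value on the added field is the kind of schema-evolution detail that keeps producers and consumers compatible as the data model changes.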
 

Webinar speakers

Frank Schmidt heads up b.telligent's Competence Center Big Data & DevOps. He is a business economics graduate with over twelve years of project experience in the telecommunications, e-commerce, internet and media sectors. This experience has given him an excellent understanding of strategy and conceptual planning for business intelligence, big data and streaming architectures.

Christian Schroer is a principal consultant at b.telligent's Competence Center Big Data & DevOps. He has over six years of project experience with big data technologies and the Hadoop Ecosystem. During the past two years, he has specialized in building modern big data and streaming platforms for innovative companies in the internet sector and in industry and commerce.

 

Click here to register!


We look forward to your participation!

If you are also interested in other webinars in our "BI now – into the digital age with big data, analytics & co." series, you can find out more about upcoming webinars here.

More about b.telligent