BIK 2018 highlights - agile product development at car2go thanks to microservices and streaming data
In his lecture at the BI Congress to be held by b.telligent on 3rd May 2018 in Munich, Marc Lenz, project manager for big data at the car2go group, will reveal the added value his team obtains from microservices. In an interview beforehand, he describes why streaming data are so important for analysis and product development.
car2go is the global market leader in fully flexible car sharing. A subsidiary of car manufacturer Daimler, the company operates in 26 cities across a total of eight countries worldwide. In Germany, car2go is active in seven cities with over 3,900 vehicles and almost 900,000 registered users. Over three million users are registered worldwide.
Holding a diploma in information technology, Marc Lenz heads the development of the business intelligence system landscape at car2go. He is responsible for the implementation and operation of software systems, and designs data, software and system architectures as well as governance models that ensure enterprise-wide data consistency while allowing agile development of data-driven products.
The fundamental transformation driven by product development at car2go is turning systems and data structures from central monoliths into distributed microservices. Marc Lenz shows how business intelligence responds to these changes, and how the transformation of the BI architecture provides real added value throughout the enterprise. He also explains how an appropriate data and software architecture makes product development more agile and enables the creation of data-driven services across divisions.
What exactly is behind the transformation of product development at car2go, and what consequences does it have for the systems and data structures you mention?
Marc Lenz: Our goal is to use big data analytics and the system for more than purely analytical purposes. Data management, as well as the data themselves, should serve product development to a greater degree and raise awareness of such data in the business domains. In practice, this means that we establish the big-data system and the streaming-data component as a central data hub and product development platform, enabling service-to-service communication in order to develop better digital products. At the same time, the platform serves as the basis for all future analytical data-science applications. We want to combine all of this and get rid of one-way streets.
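The hub-and-spoke pattern Lenz describes - one central stream of events that many services consume independently - can be illustrated with a minimal, purely illustrative sketch. The topic name, event fields and in-memory broker are invented stand-ins; a real deployment would use a streaming platform such as Kafka:

```python
from collections import defaultdict


class DataHub:
    """Minimal in-memory stand-in for a streaming data hub (broker)."""

    def __init__(self):
        self._subscribers = defaultdict(list)

    def subscribe(self, topic, handler):
        # A service registers interest in a topic.
        self._subscribers[topic].append(handler)

    def publish(self, topic, event):
        # Each event fans out to every subscribed service.
        for handler in self._subscribers[topic]:
            handler(event)


hub = DataHub()
analytics_events = []    # consumed by an analytical application
relocation_events = []   # consumed by a relocation microservice

# Two independent services consume the same vehicle-position stream:
hub.subscribe("vehicle-positions", analytics_events.append)
hub.subscribe("vehicle-positions", relocation_events.append)

hub.publish("vehicle-positions", {"vehicle_id": "A123", "lat": 48.14, "lon": 11.58})
```

The point of the pattern is that analytical consumers and product-facing microservices read the same stream, so there is no one-way street from operations into analytics.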
A concrete application for us is optimized relocation. Utilization differs between our business areas. Based as far as possible on forecasts, we try to relocate cars from areas with potentially lower demand at certain times of day to areas where demand is higher, for example from a city's outskirts to its centre. We underpin these decisions algorithmically and mathematically, using appropriate data as evidence. For this purpose, our data scientists have designed a model that recognizes demand in advance in a highly optimized way while taking all relevant factors into account.
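The principle behind forecast-based relocation can be sketched as matching zones with surplus vehicles to zones with unmet forecast demand. This is a deliberately naive sketch; the zone names and greedy matching are invented for illustration, and car2go's actual model is far richer:

```python
def relocation_plan(forecast, fleet, threshold=1):
    """Propose (from_zone, to_zone) moves from low-demand to high-demand zones.

    forecast: {zone: expected demand}, fleet: {zone: available cars}.
    Illustrative only - a real model would weigh costs, distances and time of day.
    """
    surplus = {z: fleet.get(z, 0) - forecast.get(z, 0)
               for z in set(fleet) | set(forecast)}
    donors = sorted((z for z, s in surplus.items() if s >= threshold),
                    key=lambda z: surplus[z], reverse=True)
    receivers = sorted((z for z, s in surplus.items() if s <= -threshold),
                       key=lambda z: surplus[z])
    moves = []
    for donor in donors:
        for receiver in receivers:
            # Move one car at a time until either side is balanced.
            while surplus[donor] > 0 and surplus[receiver] < 0:
                moves.append((donor, receiver))
                surplus[donor] -= 1
                surplus[receiver] += 1
    return moves


# Example from the interview: outskirts have spare cars, the centre has demand.
plan = relocation_plan(forecast={"outskirts": 2, "centre": 6},
                       fleet={"outskirts": 5, "centre": 3})
```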
If our data scientists can compute their models on the platform and build prototypes on which the final product already runs, this significantly shortens the time to market. This generates distinct added value for us because we can move directly into production and there are no longer any significant breaks between systems. As a result, we are able to develop data-driven services and products more quickly and in greater numbers, and fill gaps better through faster delivery.
Appropriate organizational anchoring is important with regard to data quality. The classic lifecycle usually begins with a creative workshop and an examination of which sources data can be obtained from, for example in order to analyze demand in advance. Technical hurdles often arise in this process, because certain data sources are either unavailable or the data are not available in the form the data scientists need. Important variables include, for example, demand data indicating the times at which customers look for vehicles.
Our goal is to create a system and organizational landscape in which data are understood, handled and maintained as production resources: continuously managed, quality-tested and quantified in terms of their value. In the relocation example, demand data have a certain value for the company, so it is strategically useful for us to keep the data ready at all times instead of re-evaluating them for each use case. For smooth operations, it makes sense to assure the quality and legal compliance of data, and to keep them available for direct use in this "data universe". Naturally, this transformation of our system landscape entails standardization of data and processes.
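Treating data as a managed production resource implies continuous, automated quality checks before records enter the "data universe". A minimal sketch of such a check for demand records might look like this; the field names and rules are invented for illustration, not car2go's actual schema:

```python
from datetime import datetime


def check_demand_record(record):
    """Return a list of quality issues for one demand record (empty = clean).

    Illustrative rules: required fields present, a non-negative request
    count, and an ISO 8601 timestamp.
    """
    issues = []
    for field in ("zone", "timestamp", "requests"):
        if field not in record:
            issues.append(f"missing field: {field}")
    if "requests" in record and (not isinstance(record["requests"], int)
                                 or record["requests"] < 0):
        issues.append("requests must be a non-negative integer")
    if "timestamp" in record:
        try:
            datetime.fromisoformat(record["timestamp"])
        except (TypeError, ValueError):
            issues.append("timestamp is not ISO 8601")
    return issues


clean = check_demand_record(
    {"zone": "centre", "timestamp": "2018-05-03T09:00:00", "requests": 12})
dirty = check_demand_record({"zone": "centre", "requests": -1})
```

Checks of this kind, run continuously in the data pipeline rather than once per project, are what turn ad-hoc re-evaluation into managed, ready-to-use data.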
Business intelligence at car2go has evolved from a team of pure "consumers" who loaded and transformed data and used it as a basis for building cubes. In future, the data engineers of the BI team will also be involved more closely in product development in order to implement data quality and data engineering there directly.
In most cases, 80% of a project to develop a product or microservice consists of data preparation and transformation. This proportion can only be reduced if the product development department is involved in quality matters - technically as well as organizationally - from the outset.
b.telligent confronts us with the right questions. How do we achieve data privacy compliance? How does our data architecture influence software architecture, and how does this benefit product development? What type of organization influences processes to what extent? Which roles must be redefined and which new tasks need to be performed now on an ongoing basis?
This involves establishing an enterprise data model that creates a stronger bond between technology and organization. The teams include staff with a great deal of domain knowledge and a sufficient understanding of the business to assess data quality. Classic developers, in contrast, can properly assess technical quality. Here we define new responsibilities and reinforce existing ones. One consequence should be the designation of product owners for data: persons responsible for implementing data stewardship and for embracing data management with an in-depth business understanding and domain knowledge.
b.telligent credibly gave us the impression that they are not just technical experts on the subject but also understand the organizational dimension. b.telligent helps us to adapt the organization to the transformation and to establish new structures by means of a data governance concept. This is essential.
b.telligent BI Kongress 2018
Thursday, 3rd May 2018
Hilton Munich Park Hotel
Am Tucherpark 7
All further information on the agenda, speakers and sponsors: