Data-driven materials science for industry
We have analyzed terabytes of data. We compute on modern high-performance clusters and in the cloud (GCP, Azure, AWS).
Take advantage of our expertise in technical computing: we apply artificial intelligence and data science to solve your challenges and reveal novel patterns in your data.
The wealth of experimental data enables novel ways of solving problems. Data-hungry algorithms like machine learning constitute a paradigm shift in engineering. They make the invisible visible.
Unearth the Data Treasure
Manual image data analysis can be very time-consuming or complex, and a sound interpretation requires in-depth knowledge of materials. Our experience in data science enables automated and quantitative analyses of images. State-of-the-art visualization makes the interpretation simple and understandable.
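As a hedged sketch of what automated, quantitative image analysis can look like, the snippet below counts and sizes particles in a segmented micrograph. The image here is synthetic, and the use of `scipy.ndimage` is an illustrative assumption, not a statement about any specific pipeline.

```python
import numpy as np
from scipy import ndimage

# Synthetic segmented micrograph: 1 = particle, 0 = background.
# (A real workflow would start from a thresholded or ML-segmented image.)
img = np.zeros((64, 64), dtype=np.uint8)
img[5:15, 5:15] = 1    # particle 1 (100 px)
img[30:40, 30:45] = 1  # particle 2 (150 px)
img[50:55, 10:12] = 1  # particle 3 (10 px)

# Label connected components and measure each particle's area in pixels.
labeled, n_particles = ndimage.label(img)
sizes = ndimage.sum(img, labeled, index=range(1, n_particles + 1))
```

From the label image, further quantities such as size distributions or shape descriptors follow in a few more lines.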
Predict the Future
Many engineering challenges can be formulated as classification or regression tasks. These methods enable process optimization, outcome prediction, and rapid OK/NOK (pass/fail) evaluations. Such predictions lead to novel solutions and accelerate development.
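A minimal sketch of such an OK/NOK classification, on synthetic data and assuming scikit-learn is available. The feature names and the failure rule are illustrative only.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
# Synthetic process parameters, e.g. two normalized sensor readings.
X = rng.normal(size=(500, 2))
# Illustrative rule: parts fail (NOK = 1) when both readings drift high.
y = ((X[:, 0] + X[:, 1]) > 1.0).astype(int)

X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)
clf = RandomForestClassifier(n_estimators=100, random_state=0)
clf.fit(X_train, y_train)
accuracy = clf.score(X_test, y_test)  # fraction of correct OK/NOK calls
```

On real process data the same pattern applies: features from sensors or metrology, labels from inspection, and a trained model that flags NOK parts early.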
Technical Skill Set
- Python, MATLAB (Octave)
- Experience in High-Performance Computing
- Experienced working in UNIX environments
- Various material simulation codes
Machine Learning Skills
- TensorFlow, OpenCV, Keras, scikit-learn
- Object detection (e.g., Mask R-CNN)
- (Variational) autoencoder
- CNN (1D, 2D, 3D)
- SVM, random forest
- K-means clustering
- Time series analysis, RNN
- NumPy, pandas
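To make one item from this toolbox concrete, here is a pure-NumPy sketch of K-means clustering (Lloyd's algorithm) on two synthetic point clouds. In practice scikit-learn's `KMeans` would typically be used; this hand-rolled version is for illustration only.

```python
import numpy as np

def kmeans(X, k, n_iter=50, seed=0):
    """Plain Lloyd's algorithm: alternate assignment and center updates."""
    rng = np.random.default_rng(seed)
    centers = X[rng.choice(len(X), size=k, replace=False)]
    for _ in range(n_iter):
        # Assign each point to its nearest center (squared Euclidean distance).
        d2 = ((X[:, None, :] - centers[None, :, :]) ** 2).sum(axis=-1)
        labels = np.argmin(d2, axis=1)
        # Move each center to the mean of its assigned points.
        for j in range(k):
            if np.any(labels == j):
                centers[j] = X[labels == j].mean(axis=0)
    return labels, centers

# Two well-separated synthetic blobs of 50 points each.
rng = np.random.default_rng(1)
X = np.vstack([rng.normal(0.0, 0.3, (50, 2)), rng.normal(5.0, 0.3, (50, 2))])
labels, centers = kmeans(X, k=2)
```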
Data Visualization & Reporting
- Volume Graphics VGSTUDIO MAX
- ParaView, Visualization Toolkit (VTK)
- Jupyter, Matplotlib, Plotly, Streamlit, Gnuplot
- Experienced with GCP, Azure, and AWS
- GitHub, GitLab
- NoSQL databases and tools
- SLURM, Azure CycleCloud
- Experience in managing scientific compute environments
Technical Computing at Work
3D Data Visualization
Analyzing large 3D datasets requires the use of advanced visualization software packages. We are experts in common 3D suites such as Volume Graphics or ParaView. We develop our own post-processing algorithms to quantify material properties or physical observables. This enables the improvement of designs and processes in development and production.
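One example of such a post-processing quantity is porosity derived from a segmented CT volume. The sketch below uses a synthetic voxel array and assumes a binary labeling (0 = pore/air, 1 = material); real data would come from a CT reconstruction and segmentation step.

```python
import numpy as np

rng = np.random.default_rng(0)
# Synthetic segmented CT volume, 64^3 voxels, roughly 5% pores.
volume = (rng.random((64, 64, 64)) > 0.05).astype(np.uint8)

material_voxels = int(volume.sum())
total_voxels = volume.size
# Porosity = pore volume fraction of the scanned region.
porosity = 1.0 - material_voxels / total_voxels
```

The same voxel-counting idea extends to pore size distributions or local density maps once the pores are labeled as connected components.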
We want to bring data to life. We use cutting-edge data representation packages to give users an interactive experience and let them explore their data themselves. We believe that having the data at your own fingertips leads to a deeper understanding and faster solutions to engineering challenges.
Basic GUIs for AI & Data Science
We are aware that low-code environments are required in daily operations. With simple graphical environments, we enable non-experts to run complex machine learning algorithms with a few clicks. We bring the entire AI machinery to your business - containerized and maintainable.
We write custom Python analyses in a matter of hours - an effective complement to industry-standard evaluation methods using spreadsheets. For high-throughput applications, we use massively parallel environments in C++. We do this in our self-maintained Linux environments.
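As a hedged illustration of such a spreadsheet-style evaluation in Python, the snippet below aggregates strength measurements per batch with pandas and flags out-of-spec batches. The column names, values, and spec limit are illustrative assumptions.

```python
import pandas as pd

# Illustrative measurement table, as it might be exported from a test rig.
df = pd.DataFrame({
    "batch": ["A", "A", "B", "B", "B"],
    "strength_mpa": [412.0, 398.5, 377.2, 390.1, 365.8],
})

# Per-batch summary statistics, the pandas equivalent of a pivot table.
summary = df.groupby("batch")["strength_mpa"].agg(["mean", "min", "max"])

# Flag batches whose minimum strength falls below a (hypothetical) spec limit.
spec_limit = 380.0
summary["below_spec"] = summary["min"] < spec_limit
```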
We have been using machine learning for years. Implementations range from chemical reaction mechanisms to physical properties of condensed matter. Our AI toolbox includes both classical shallow learning techniques and more data-hungry deep learning methods, e.g., TensorFlow or similar libraries. These techniques are ideally suited for computer vision applications, such as those used in industrial computed tomography.
Data Science requires profound statistical knowledge and experience in analyzing large datasets. With our many years of scientific and technical experience, we can help you derive quantifiable performance metrics, quality attributes, or correlations of variables or parameters. We often start with a blank sheet and an idea, which is then translated into code.
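A minimal sketch of quantifying such a correlation, on synthetic data: a process parameter (temperature) with a linear influence on a quality attribute (hardness), plus measurement noise. Variable names and the relationship are assumptions for illustration.

```python
import numpy as np

rng = np.random.default_rng(42)
# Synthetic process parameter and a quality attribute that depends on it.
temperature = rng.uniform(180.0, 220.0, size=200)
hardness = 0.8 * temperature + rng.normal(0.0, 5.0, size=200)

# Pearson correlation coefficient between parameter and attribute.
r = float(np.corrcoef(temperature, hardness)[0, 1])
```

A strong coefficient like this would justify a follow-up regression model; a weak one tells you early that the parameter is not the lever you are looking for.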
We are proficient in setting up high-performance compute environments on Google Cloud Platform (GCP), Microsoft Azure, and Amazon Web Services (AWS), giving us access to highly scalable GPU training environments. For compute-intensive tasks, we also draw on extensive experience in building dedicated high-performance compute clusters.