Data Challenges in Chip Industry

Data management is becoming a significant new challenge for the chip industry, as well as a new opportunity, as the amount of data collected at every step, from design through manufacturing, continues to grow.

Increasing Complexity and New Demands

The rising complexity of designs and increasing demands for reliability and traceability exacerbate the problem. Highly customized, domain-specific designs, along with the growing focus on chiplets developed using different processes and new materials, increase data generation. EDA and verification tools now generate terabytes of data weekly or even daily.

While these data can provide valuable insights for improving designs and processes, managing the current volumes is an ongoing challenge. The industry must rethink well-proven methodologies and processes and invest in new tools and approaches. Additionally, incorporating AI and ML into design tools to identify anomalies and patterns in large data sets complicates the landscape further, as these tools are regularly updated with new algorithms and features.

Rob Knoth, product management director in Cadence’s Digital & Signoff group, noted that each company has its own design flow and data management methodology, which can leave massive volumes of data stored with no clear plan for how they will be used.

Opportunities and Challenges in Data Management

According to Tony Chan Carusone, CTO at Alphawave Semi, the problem lies in organizations not designing scalable systems from the start. This presents an opportunity to leverage data in new ways, although it requires reorganizing systems and processes.

Jim Schultz, senior staff product manager at Synopsys, categorizes data management challenges into three buckets: determining which information is critical, managing data collection, and optimizing data usage. The keys are identifying which data are essential to the design process and applying analytics tools built for industry-specific data.

Data Security and Collaboration

Data security is another critical aspect. Simon Rance, director of product management at Keysight, emphasized that securing data is as important as deciding which data to save. Collaboration among companies in different regions heightens the need to balance security with operational efficiency.

The complexity of chip design, involving hundreds of people and multiple companies, makes data security even more crucial. Gina Jacobs, head of global communications and brand marketing at Arteris, suggested that standardized, accessible data storage could solve many problems, allowing teams to access and modify information as needed.

Solutions and the Future of Data Management

The first step in addressing data overload is identifying essential data. Rance highlighted using smart algorithms to optimize storage and transfer times. Additionally, Schultz envisions a future where generative AI will touch every facet of chip development, allowing for quicker trend analysis and pattern recognition.
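One concrete example of the kind of "smart algorithm" Rance describes is content-addressed deduplication: identical artifacts (repeated simulation logs, regenerated reports) are stored and transferred only once, keyed by a hash of their contents. The sketch below is illustrative only and assumes nothing about any vendor's actual implementation; the function names are hypothetical.

```python
import hashlib
from pathlib import Path

# Illustrative sketch: content-addressed deduplication to cut storage and
# transfer volume. Files with identical bytes are stored once, keyed by
# the SHA-256 digest of their contents.

CHUNK = 1 << 20  # read files in 1 MiB chunks to bound memory use


def content_hash(path: Path) -> str:
    """Return the SHA-256 hex digest of a file's contents."""
    h = hashlib.sha256()
    with path.open("rb") as f:
        while chunk := f.read(CHUNK):
            h.update(chunk)
    return h.hexdigest()


def dedup_store(paths, store: dict) -> int:
    """Store each file's bytes under its content hash; return bytes saved.

    `store` stands in for any content-addressed backend (here, a dict).
    Duplicate files cost nothing beyond the hash computation.
    """
    saved = 0
    for p in paths:
        digest = content_hash(p)
        if digest in store:
            saved += p.stat().st_size  # duplicate: nothing new stored or sent
        else:
            store[digest] = p.read_bytes()
    return saved
```

In practice the same idea is applied at the chunk level rather than whole files, so near-duplicate logs also deduplicate well, and the hash doubles as an integrity check during transfer.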

Despite advancements, comprehensive AI tools capable of complex analysis are still years away. Knoth of Cadence mentioned some clients’ reluctance to adopt these technologies due to costs associated with disk space, compute resources, and licenses.

Tony Chan Carusone suggested that hardware engineers adopt lessons from software development, such as organizing and automating data collection and developing methods to verify and test in parallel.
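The "verify and test in parallel" idea borrowed from software practice can be sketched as a worker pool fanning out independent checks over a design, much as a CI system runs a test suite. The check names and the design structure below are hypothetical, chosen purely for illustration.

```python
from concurrent.futures import ThreadPoolExecutor

# Illustrative sketch: run independent verification checks concurrently,
# the way a software CI system runs a test suite. Each check is a function
# taking the design and returning a pass/fail result.


def run_checks(design, checks):
    """Run each check(design) concurrently; return {check name: result}."""
    with ThreadPoolExecutor() as pool:
        futures = {name: pool.submit(fn, design) for name, fn in checks.items()}
        return {name: fut.result() for name, fut in futures.items()}


# Hypothetical checks on a toy design record:
checks = {
    "timing": lambda d: all(p["slack"] >= 0 for p in d["paths"]),
    "power": lambda d: d["power_mw"] <= d["power_budget_mw"],
}
```

Because the checks share no state, adding a new one is just another entry in the dictionary, which is exactly the kind of organized, automated collection-and-verification discipline being suggested.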

The influx of new tools for analyzing and testing chip designs has increased productivity, but it has also brought additional considerations. Effective data management is essential to maximize the value of that data. Powerful tools are already in place to help, and the AI revolution promises even more capable resources to streamline testing and analysis.

For now, handling all that data remains a delicate balance. Effective communication and investment in specific project and chip management processes are crucial for success.