- Design Technology
- System Technology
- Validation/Test
Author/Speaker: Sravasti Nair, Product Engineering Group Director, Cadence Design Systems
Title: Keeping up with node scaling: evolution of EDA solutions in the era of Moore and beyond
Abstract: Electronic Design Automation (EDA) solutions have evolved constantly in response to the continuous advances in process technology following Moore’s law over the past few decades. These advances have greatly increased IC design complexity, not just for designers but also for EDA solutions. The drive to reduce feature sizes beyond the optical resolution of visible and ultraviolet light has led to the need for multiple masks/patterns for the same layer to allow a more compact layout. The need for greater scaling and manufacturing accuracy has led to self-aligned fabrication processes requiring gridded, unidirectional interconnects. At the same time, ‘3D’ devices such as tri-gate FinFETs, followed by nanosheet or ribbon FETs, have been introduced to address the power, leakage, and variability associated with these processes while further reducing the layout footprint. In addition to more restrictive and complex design rules for manufacturability and process characteristics, EDA solutions need to account for changes in the layout paradigm for placement, fill, and routing, especially for analog/full-custom IC designs. This presentation will cover the sweeping changes in EDA tools made to meet the process technology requirements of these nodes for both devices and interconnects, with a focus on physical implementation. It will also touch upon tool changes introduced to account for the increasingly significant parasitic and electrical effects that require up-front analysis for a correct-by-construction layout, minimizing costly iterations on these nodes.
Author/Speaker: Aditi Priya, ASIC Design Engineer, Synopsys
Co-Authors: Tanvi Agrawal, Manu Saxena
Title: Early-Stage RTL Power Estimation and Exploration
Abstract: Power management strategies, once used in only a few applications, have become ubiquitous. Per Moore’s Law, the number of transistors on a microchip doubles every two years, underscoring the importance of early-stage power exploration. Designers can no longer wait until the final netlist to get accurate power numbers; estimating power consumption early in the design process is imperative. This paper addresses the problem of robust early power estimation, since visibility is needed from the start of register transfer level (RTL) coding. Voltage domains, switchable power domains, and clock-gating techniques are examples of common power-saving approaches. Voltage and power domains necessitate dedicated logic circuits and complicate the power grid architecture, so it is necessary to ensure their requirements are adequately met. The power estimation and exploration solution aids in estimating data across different intellectual properties (IPs) and circuits. These configurations entail conducting power estimation, profiling, and reduction iteratively to assess and enhance the power efficiency of the design. This paper aims to find significant power inefficiencies at the preliminary RTL stage, when the most rewarding modifications can be made, thereby saving both power and overall turnaround time.
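The payoff of early power visibility rests on the classic dynamic switching-power relation P = α·C·V²·f, where clock gating lowers the effective toggle rate α. A back-of-envelope sketch (all names and numbers here are hypothetical, not taken from the paper):

```python
# Illustrative dynamic-power estimate from toggle activity using
# P = alpha * C * V^2 * f. Hypothetical values, for intuition only.

def dynamic_power(alpha, cap_farads, vdd_volts, freq_hz):
    """Average dynamic switching power in watts."""
    return alpha * cap_farads * vdd_volts ** 2 * freq_hz

# Hypothetical net: 10% toggle rate, 5 fF switched capacitance, 0.8 V, 1 GHz.
p = dynamic_power(alpha=0.10, cap_farads=5e-15, vdd_volts=0.8, freq_hz=1e9)

# Clock gating the block halves its effective toggle rate, and hence
# halves that block's dynamic power.
p_gated = dynamic_power(alpha=0.05, cap_farads=5e-15, vdd_volts=0.8, freq_hz=1e9)
print(p, p_gated)
```

Catching a high-α net at RTL, when restructuring or gating is cheap, is exactly the kind of "rewarding modification" the abstract refers to.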
Author/Speaker: Pei Yao, Principal Staff Engineer, AMD
Co-Authors: Lei Zhou, Nitin Navale, Stanley Chen, Peter Chen
Title: A Novel Simulation Flow for 7nm Mixed-Signal Design Transistor Level Verification
Abstract: A single simulation environment for both analog and digital circuits is developed. A new parallel simulator is used to perform functional verification at the transistor level. The efficiency of the flow is demonstrated using a PLL top-level testbench, and the flow has proven effective at detecting potential bugs in the early design phase.
Author/Speaker: Abir Bazzi, Principal Application Engineer, Infineon Technologies
Co-Authors: Dr. Di Ma, Dr. Adnan Shaout
Title: Secure Software Update in Automotive Modern Software Architecture, Student Paper – University of Michigan, Dearborn
Abstract: Software over-the-air update (SOTA) has recently gained high interest in the automotive industry. Installing or updating software on a device chip in the vehicle over the air is not as simple as it might seem; there are many tricky operational and security aspects. Electronic Control Units (ECUs) may need to accept upgrades while the devices are running, which typically requires redundancy and additional resources so that part of the ECU can be upgraded while the rest continues to operate safely. An ECU can only run a new version of received software after verifying that the entire image file received from the server is correct. This concept imposes additional constraints and requirements on the software development and update process. Externally to the ECU, it has to be made safe and secure to update just part of the software. Internally to the ECU, the updated software must be safe and secure with respect to the rest of the system and the ECU’s functionality. One widely used security mechanism is code signing of each software image: it is critically important to verify that the contents of the image have not been tampered with, and that the received image comes from the intended publisher. Code signing is a technology that uses digital signatures and public key infrastructure to sign image files. The integrity of the system relies on securing both the private and public keys against outside access. Public keys are stored in the ECUs, and hardened cryptographic hardware is required to protect them. Driven by new automotive trends (autonomous driving, connected cars, shared mobility…), the new software architecture decomposes an ECU software image into multiple independent, distributed, and loosely connected blocks or clusters. Thus, code signing becomes more complicated and demands more resources, as well as coordination between the publishers and the endpoint ECUs in the vehicle.
In our proposed SOTA scheme, we aim to optimize the code signing and key management needed for this new architecture so as to meet automotive ECU constraints while providing the security necessary to protect the vehicle against both passive and active attacks during software over-the-air updates. Operating at the ECU level, our solution complements existing SOTA frameworks and solutions by implementing a new Merkle tree-based algorithm that additionally addresses dependencies and conflicts among software entities within the ECU through logical implicit signature verification, without adding much overhead to the process and while keeping key management in the vehicle compliant with the existing process.
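The Merkle-tree idea underlying such schemes can be illustrated with the generic textbook construction (this sketch is not the authors' algorithm, and the block names are hypothetical): each software block is hashed into a leaf, a single signature over the root authenticates every block, and any one block can be verified against the root with a logarithmic-length proof.

```python
import hashlib

def h(data: bytes) -> bytes:
    return hashlib.sha256(data).digest()

def merkle_root(leaves):
    """Root hash over the leaf hashes (last node duplicated on odd levels)."""
    level = [h(x) for x in leaves]
    while len(level) > 1:
        if len(level) % 2:
            level.append(level[-1])
        level = [h(level[i] + level[i + 1]) for i in range(0, len(level), 2)]
    return level[0]

def merkle_proof(leaves, index):
    """Sibling hashes needed to recompute the root for one leaf."""
    level = [h(x) for x in leaves]
    proof = []
    while len(level) > 1:
        if len(level) % 2:
            level.append(level[-1])
        sib = index ^ 1
        proof.append((level[sib], sib < index))  # (hash, sibling-is-left?)
        level = [h(level[i] + level[i + 1]) for i in range(0, len(level), 2)]
        index //= 2
    return proof

def verify(leaf, proof, root):
    """Recompute the root from one leaf and its sibling path."""
    node = h(leaf)
    for sibling, is_left in proof:
        node = h(sibling + node) if is_left else h(node + sibling)
    return node == root

# Hypothetical ECU software clusters; signing only the root (not shown)
# implicitly authenticates every cluster image.
blocks = [b"bootloader", b"comms-stack", b"app-cluster-A", b"app-cluster-B"]
root = merkle_root(blocks)
proof = merkle_proof(blocks, 2)
print(verify(b"app-cluster-A", proof, root))  # True
print(verify(b"tampered", proof, root))       # False
```

The appeal for constrained ECUs is that only one signature verification (on the root) is needed, while per-block integrity checks reduce to cheap hashing.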
Author/Speaker: Akhila Shamsunder, Senior Staff Engineer, Automotive Software, Infineon Technologies
Co-Authors: Swasati Baishya, Karan Mundhra, Sandeep Chandrashekar
Title: Method for detection and mitigation of Control Flow and Data Flow Violations in Automotive Software
Abstract: The complexity of automotive software has increased severalfold over the years. Since automotive systems are life-critical, it is imperative that software quality is not compromised. Complex software can lead to an increase in systematic software faults, which could result in undefined and dangerous system behavior. Hence, more stringent mechanisms are needed to avoid and detect faults early in development without compromising the use of the software. The paper describes an approach to detect data-access and control-flow violations using static analysis. It also describes an additional test method to establish freedom from interference between software elements in a mixed-criticality system.
Author/Speaker: Yujeong Shim, Silicon Engineer, Google Cloud, Google
Title: Design and technology spaces for heterogeneous chiplet integration
Abstract: Heterogeneous chiplet integration is an emerging technology to boost computing power and build cost-effective systems for HPC, AI, and ML ASICs. In this paper, we introduce the design and technology spaces for chiplet integration and propose further technology enhancements in interface design and packaging technology.
Author/Speaker: Janet Olson, Vice President Research and Development for Front-End Design, Cadence Design Systems
Title: Criticality of physically aware DFT for convergence of design of semiconductors
Abstract: The act of observation changes the system being observed. In design for test (DFT), the logic inserted to observe and control the functional design can impact the overall power, performance, area (PPA), and congestion. As test has become more pervasive and mission critical, semiconductor PPA targets and schedules have also tightened. Making DFT logic insertion physically aware can not only help with design convergence but can also improve the quality of test results. We will explore some novel applications of physically aware test in the domains of compression and test-point insertion, how it can help achieve a better product and process, and the implications for the future.
Author/Speaker: Manny Wright, Senior Consulting Engineer, Imperas Software
Title: Open-source is a great price, but verification is the real value
Abstract: RISC-V has taken the processor segment by storm since this open-standard ISA project began in 2010. Its popularity in research institutes and academia soon spread to commercial implementations across all markets, from the tiniest embedded applications to performance-critical AI server farms. RISC-V adopters are exploring the processor design freedoms enabled by this open standard ISA. In turn, these design freedoms are also driving interest in open-source hardware.
The OpenHW Group’s RISC-V-based processors are open-source cores ready for commercial adoption. Free is a great price, but the real value is in OpenHW’s verification investment to ensure these cores are of industrial quality and compliant with the RISC-V ISA standard. This paper discusses the functional verification of open-source processor core IP to industrial grade for commercial adoption.
Author/Speaker: Bernice Zee, Senior Member of Technical Staff, AMD
Co-Authors: Angeline Phoa, Syahirah Zulkifli, Qiu Wen, Oh Ziying
Title: The Application of Machine Learning in the Next Frontier of Failure Analysis Fault Isolation
Abstract: The continued scaling of transistor technology, in tandem with the rapid development of next-generation packaging technologies, has presented interesting challenges to current failure analysis techniques. Whether at the transistor or package interconnect level, the ability to localize and visualize defects prior to physical failure analysis (PFA) is essential for successful root cause analysis. Failure analysis (FA) typically follows a workflow to effectively isolate the failure and determine the root cause. It consists of electrical fault isolation (FI) techniques to localize an area of inspection, non-destructive testing (NDT) to look for defects, followed by destructive PFA.
In recent years, machine learning methodologies have been applied to both the FI and NDT steps of the FA workflow to aid defect detection, as defects have become more subtle and challenging to differentiate in denser and more complex semiconductor packages. Depending on the technique, different machine learning methodologies are applied. This paper provides an overview of recent applications of machine learning in the semiconductor FA workflow. The areas covered include machine-learning-based computer vision (CV) defect detection through image comparison between good and rejected devices, unsupervised and supervised learning using independent component analysis (ICA) to analyze acoustic microscopy data, and deep-learning-based high-resolution reconstruction techniques for 3D X-ray imaging.
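The core of reference-image comparison for defect localization can be sketched in a few lines: subtract a known-good device image from a test image and flag pixels whose difference exceeds a noise threshold. This is a synthetic toy illustration of the idea only; the images, threshold, and injected "defect" are made up, and real CV/ML FA flows are far more involved.

```python
import numpy as np

rng = np.random.default_rng(0)
good = rng.normal(0.5, 0.01, size=(64, 64))            # reference "good" image
test = good + rng.normal(0.0, 0.01, size=good.shape)   # noisy copy of it
test[30:34, 40:44] += 0.3                              # injected "defect" patch

# Flag pixels whose absolute difference rises above the noise floor.
diff = np.abs(test - good)
mask = diff > 0.1

ys, xs = np.nonzero(mask)
print(f"defect pixels: {mask.sum()}, "
      f"bounding box: rows {ys.min()}-{ys.max()}, cols {xs.min()}-{xs.max()}")
```

Learned methods replace the fixed threshold with models that tolerate alignment error, process variation, and structured backgrounds, but the comparison-against-a-good-reference principle is the same.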