Due to increasing overall complexity and integration, electronic engineers are confronted daily with ever more complicated ElectroMagnetic Interference (EMI) issues. As a result, many first prototypes fail to pass all EMI certification tests, causing significant losses in time and profit. Up to now, debugging EMI issues has mostly been done in costly EMI chambers. Moreover, these tests are performed rather late in the design cycle, when there is little flexibility left to implement the optimal and most cost-efficient EMI mitigation methods. Simulations offer a great deal of flexibility when estimating the EMI impact of different elements in an electronic system and can help identify the true sources of potential EMI issues. With this knowledge available very early in the design stage, one can implement cheap yet effective EMI mitigation methods without resorting to more costly EMI suppressors such as shielded connectors, chokes, or specially designed enclosures. Unfortunately, due to their complexity, most EMI problems require excessive computing resources, both in terms of simulation time and computer memory. However, thanks to the recent adoption of GPU parallel processing technology in modern EM simulation tools, it has become feasible to solve complex EMI problems accurately within a reasonable amount of time. This paper gives an overview of efficient methodologies currently used within Nvidia for cost-effective EMI suppression by means of a virtual EMI lab, both early in the design process and after physically testing a first prototype. The challenges that were successfully tackled include (i) optimal routing on a multi-layered Printed Circuit Board (PCB), (ii) the use of on-board shielding, (iii) the influence of grounding on a connector-to-PCB transition, (iv) simultaneous switching output (SSO) noise emission reduction, and (v) estimating the influence of the exact location of an ESD diode.
