The rapid proliferation of Internet of Things (IoT) devices has introduced significant challenges, particularly in managing electronic noise, which can degrade system performance and data accuracy. This study explores the integration of IoT with edge computing as an approach to mitigating electronic noise. Edge computing processes data locally, shortening transmission distances and enabling real-time noise reduction. The research combines field experiments with computational simulations to evaluate the effectiveness of this integration. Key metrics, including noise level, system latency, energy efficiency, and signal quality, were measured before and after the noise mitigation technologies were deployed. Results indicate a significant reduction in electronic noise, with levels decreasing from 75 dB to 55 dB (a 26.67% reduction). System latency improved by 66.67%, dropping from 120 ms to 40 ms, while energy efficiency increased by 30%. Signal quality, measured by the Signal-to-Noise Ratio (SNR), improved by 100%, from 10 dB to 20 dB. These findings demonstrate that integrating IoT with edge computing effectively mitigates electronic noise, enhances system performance, and optimizes energy consumption. The study concludes with recommendations for further optimization of signal processing algorithms and for testing in more complex industrial environments.
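As a minimal sketch, the percentage figures quoted above can be reproduced directly from the reported before/after measurements; the helper function below is hypothetical and not part of the study's tooling, and it simply takes the ratio of the raw values as the abstract does (including for the dB-valued metrics).

```python
# Illustrative sketch: recompute the relative changes reported in the
# abstract from the raw before/after values. All figures come from the
# text; the function name is an assumption for illustration only.

def relative_change(before: float, after: float) -> float:
    """Percentage change from `before` to `after`, relative to `before`."""
    return (after - before) / before * 100.0

metrics = {
    "noise level (dB)": (75.0, 55.0),   # reported as a 26.67% reduction
    "latency (ms)":     (120.0, 40.0),  # reported as a 66.67% improvement
    "SNR (dB)":         (10.0, 20.0),   # reported as a 100% improvement
}

for name, (before, after) in metrics.items():
    change = relative_change(before, after)
    print(f"{name}: {before} -> {after} ({change:+.2f}%)")
```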