Integration of optical communication circuits directly into high-performance microprocessor chips can enable extremely powerful computer systems. A germanium photodetector that can be monolithically integrated with silicon transistor technology is viewed as a key element in connecting chip components with infrared optical signals. Such a device should have the capability to detect very-low-power optical signals at very high speed. Although germanium avalanche photodetectors (APDs) using charge amplification close to avalanche breakdown can achieve high gain and thus detect low-power optical signals, they are widely thought to suffer from an intolerably high amplification noise characteristic of germanium. High gain with low excess noise has been demonstrated using a germanium layer only for detection of light signals, with amplification taking place in a separate silicon layer. However, the relatively thick semiconductor layers required in such structures limit APD speeds to about 10 GHz and demand excessively high bias voltages of around 25 V (ref. 12). Here we show how nanophotonic and nanoelectronic engineering aimed at shaping optical and electrical fields on the nanometre scale within a germanium amplification layer can overcome the otherwise intrinsically poor noise characteristics of the material, reducing amplification noise by over 70 per cent. By generating strongly non-uniform electric fields, we confine the region of impact ionization in germanium to just 30 nm, allowing the device to benefit from the noise-reduction effects that arise over such small distances. Furthermore, the small size of the APDs means that a bias voltage of only 1.5 V is required to achieve an avalanche gain of over 10 dB at operational speeds exceeding 30 GHz. Monolithic integration of such a device into computer chips might enable applications beyond computer optical interconnects: in telecommunications, secure quantum key distribution, and subthreshold ultralow-power transistors.
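For readers weighing the noise claims, the standard benchmark is McIntyre's local-field model of avalanche multiplication; the following is background context rather than material from the paper itself, and the symbol k (the ratio of hole to electron impact-ionization coefficients) is introduced here only for illustration. In that model the excess noise factor F grows with the avalanche gain M as

\[
F(M) = kM + (1 - k)\left(2 - \frac{1}{M}\right),
\]

so in bulk germanium, where electrons and holes ionize at comparable rates (k close to 1), F rises almost linearly with M; this is the intrinsically poor noise characteristic cited above. When the multiplication region is shrunk to a few tens of nanometres, comparable to the dead space a carrier must traverse before it can ionize, ionization events become more deterministic and F falls below the McIntyre limit, which is the small-distance noise-reduction effect invoked here. Note also that, taking gain in decibels as 10 log10 M, the quoted avalanche gain of over 10 dB corresponds to a multiplication factor M of at least 10.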