Abstract
An approximate analytical solution is presented, along with numerical calculations, for a system of two single-photon wave packets interacting via an ideal, localized Kerr medium. It is shown that, because of spontaneous emission into the initially unoccupied temporal modes, the cross-phase modulation in the Schrödinger picture is very small as long as the spectral width of the single-photon pulses is well within the medium’s bandwidth. In this limit, the Hamiltonian used can be derived from the “giant Kerr effect” for a four-level atom under conditions of electromagnetically induced transparency; it is shown explicitly that the linear absorption in this system increases as the pulse’s spectral width approaches the medium’s transparency bandwidth, and hence, as long as the absorption probability remains small, the maximum cross-phase modulation is limited to essentially useless values. These results agree with the general causality- and unitarity-based arguments of Shapiro and Razavi [J. H. Shapiro, Phys. Rev. A 73, 062305 (2006); J. H. Shapiro and M. Razavi, New J. Phys. 9, 16 (2007)].
Received 30 November 2009
DOI: https://doi.org/10.1103/PhysRevA.81.043823
©2010 American Physical Society