Impossibility of large phase shifts via the giant Kerr effect with single-photon wave packets

Julio Gea-Banacloche
Phys. Rev. A 81, 043823 – Published 16 April 2010

Abstract

An approximate analytical solution is presented, along with numerical calculations, for a system of two single-photon wave packets interacting via an ideal, localized Kerr medium. It is shown that, because of spontaneous emission into the initially unoccupied temporal modes, the cross-phase modulation in the Schrödinger picture is very small as long as the spectral width of the single-photon pulses is well within the medium’s bandwidth. In this limit, the Hamiltonian used can be derived from the “giant Kerr effect” for a four-level atom under conditions of electromagnetically induced transparency; it is shown explicitly that the linear absorption in this system increases as the pulse’s spectral width approaches the medium’s transparency bandwidth, and hence, as long as the absorption probability remains small, the maximum cross-phase modulation is limited to essentially useless values. These results agree with the general causality- and unitarity-based arguments of Shapiro and Razavi [J. H. Shapiro, Phys. Rev. A 73, 062305 (2006); J. H. Shapiro and M. Razavi, New J. Phys. 9, 16 (2007)].
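As a rough illustration of the trade-off stated in the abstract, the following toy calculation sketches how bounding the linear absorption probability bounds the attainable cross-phase shift for a single-photon pulse of spectral width sigma in an EIT medium of transparency bandwidth Delta. The specific scalings (absorption growing quadratically and phase growing linearly in sigma/Delta), the optical depth of 10, and the peak phase of pi are assumptions made here purely for illustration, not results quoted from the article.

# Toy model (illustrative assumptions only, not the paper's derived scalings):
# P_abs ~ D * (sigma/Delta)^2 and phi ~ phi_medium * (sigma/Delta).
import numpy as np

OPTICAL_DEPTH = 10.0   # assumed effective optical depth D (illustrative)
PHI_MEDIUM = np.pi     # assumed phase attainable at sigma = Delta (illustrative)

def absorption_probability(ratio):
    """Assumed linear-absorption probability for sigma/Delta = ratio (toy model)."""
    return OPTICAL_DEPTH * ratio**2

def max_phase_shift(ratio):
    """Assumed attainable cross-phase shift for sigma/Delta = ratio (toy model)."""
    return PHI_MEDIUM * ratio

# Requiring the absorption probability to stay below a small epsilon caps
# sigma/Delta and hence the attainable phase, mirroring the abstract's argument.
for eps in (0.01, 0.05, 0.1):
    ratio_max = np.sqrt(eps / OPTICAL_DEPTH)   # invert the toy absorption scaling
    print(f"P_abs < {eps:4.2f}  ->  sigma/Delta < {ratio_max:.3f}, "
          f"phase < {max_phase_shift(ratio_max):.3f} rad")

In this toy picture, keeping the absorption probability at the percent level confines the phase shift to a small fraction of a radian, consistent with the abstract's statement that the attainable cross-phase modulation is essentially useless under those conditions.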

Received 30 November 2009

DOI: https://doi.org/10.1103/PhysRevA.81.043823

©2010 American Physical Society

Authors & Affiliations

Julio Gea-Banacloche

  • Department of Physics, University of Arkansas, Fayetteville, Arkansas 72701, USA

Issue

Vol. 81, Iss. 4 — April 2010
