Apple is resuming the use of humans to review Siri commands and dictation with the latest iPhone software update.
In August, the company suspended the practice and apologised for the way it used people, rather than just machines, to review the audio.
While common in the tech industry, the practice undermined Apple’s attempts to position itself as a trusted steward of privacy.
Chief executive Tim Cook has repeatedly declared the company’s belief that “privacy is a fundamental human right”, a phrase that cropped up again in Apple’s apology.
Now, when consumers install the update, iOS 13.2, Apple notifies them that they can choose “Not Now” to decline audio storage and review.
Users who opt in can turn audio storage and review off later in the settings.
Tech companies say the practice helps them improve artificial intelligence services.
But the use of humans to listen to audio recordings is particularly troubling to privacy experts because it increases the chances that a rogue employee or contractor could leak details of what is being said, including parts of sensitive conversations.
Apple previously disclosed plans to resume human reviews this autumn, but had not specified when. The firm also said then that it would stop using contractors for the reviews.
Other tech companies have also been resuming the practice after giving more notice. Google restarted in September after taking similar steps to make sure people know what they are agreeing to.
Also in September, Amazon said users of its Alexa digital assistant could request that recordings of their voice commands delete automatically.