a bbc investigation has found that paedophiles are using artificial intelligence and virtual reality technology to create and sell child sexual abuse material. they market the illegal content using accounts on mainstream platforms. responding to the bbc's investigation, the government says ai abuse is no different to other kinds, and tech companies will also be required to identify and remove it. live now to tel aviv and dr guy paltieli, senior child safety researcher at activefence, a safety provider for online platforms. welcome to you. thanks very much for being with us. what have you found in this area in your research?

hello, good evening. what we found was a very distinctive trend which started at the beginning of the year: content generated by ai tools, which became much more explicit in recent months. it started with content that was basically images sexualising minors, but in recent months it became much more explicit, showing children in very, very sensitive and abusive pictures.

so it is moving at a fairly fast pace, isn't it? how widespread is that?

we found that the content that was shared actually doubled during the first quarter of 2023. and we saw a lot of cases on dark web forums and in messaging app groups where people sell the content, sometimes thousands of pictures of children. it is widespread, and we have found more and more examples like this in the past few months.

so i guess the big question i