There is something slowly changing around us—and often we don’t even realize it. Our children are still laughing and playing. But the places where they play have changed. It’s no longer just outdoors, but also behind screens. And behind those screens lies a vast, boundless world—and unfortunately, it is not always safe.
That world is called digital space. There, children can learn, make friends, and play. But there are also people with bad intentions—who look for weaknesses, approach children subtly, and eventually harm them. Activists call them “predators.”
A Kompas investigation (2026) reveals a shocking fact. Child sexual exploitation now frequently occurs through online games, social media, and AI technology that can create fake images that look completely real.
The perpetrators’ methods are often simple. They start with ordinary conversations: friendly, like a friend. Over time, they build trust, a process known as grooming. The perpetrator pretends to be a peer, chats frequently, and gives attention; then, little by little, asks for private things such as photos, videos, or sensitive stories. Once they obtain those, the threats begin.
What should make us truly vigilant is that this can happen anywhere, even inside our own homes, while children seem “safe” and “comfortable” because they are just playing games in their rooms.
The Problem Is Big—and Real
Global research shows that about 1 in 12 children worldwide experience online sexual exploitation each year (Fry et al. 2025). In Indonesia, the number of cases is also quite high and continues to rise (Kompas 2026; NCMEC 2025).
Now, with the presence of AI, the situation has become even more complicated. Ordinary photos or videos that we upload to social media can be taken, altered, and turned into inappropriate content. They can even be used to threaten our children (IWF 2026). This means the problem is not only what actually happens, but also what can be made to happen.
Parents Often Don’t Realize
We must be honest: many of us as parents are often too trusting. Because we are busy working, managing the household, or simply wanting peace and quiet, we hand gadgets to our children so they will “stay quiet.” Without realizing it, we are handing our children over to a world we ourselves do not understand.
Yet children have a great sense of curiosity. They will try many things. And in the digital world, they can encounter things they are not ready to face.
Letting children roam freely without guidance in the digital world is like releasing them into a forest—without a map, without protection.
The State Has Moved, But It’s Not Enough
The government has issued regulations, such as Government Regulation Number 17 of 2025 concerning the Governance and Protection of Children in Digital Space (PP Tunas 2025). This is a good step: children under 16 years old now have restricted access to high-risk digital platforms (Republik Indonesia 2025).
However, in reality, many people are not even aware that this regulation exists. And like many other rules in this country, the problem is often not in the regulation itself—but in its implementation. Without supervision, cooperation from digital platforms, and parental involvement, regulations become nothing more than words on paper.
The state has many instruments and authorities to protect children; rules and laws are only one of them. Where there is political will, it can be done. The proof: Indonesia was among the first countries in the world to block Grok AI after explicit, nonconsensual images of women and children appeared in Elon Musk’s AI application (DW 2026).
Technology Companies Must Also Be Responsible
From the Grok AI case, we can learn that the role of social media platforms, chat applications, and online games is extremely important. These platforms cannot view their applications merely as entertainment venues. Their systems, including chat features, algorithms, and user connections, can be exploited by criminals. If platforms fail to build and promote protection features, they indirectly become facilitators of criminal acts (UNICEF Innocenti 2025; WeProtect Global Alliance 2025).
Research shows that without good safety systems (safety by design), technologies like AI can actually be used to create fake child sexual content (Thorn 2025; NCMEC 2025).
The problem is that companies often focus more on profit than on safety. Yet they have a great responsibility: to protect users, especially children.
Parents Need Life Balance
Research also shows that children are more vulnerable when parents are too busy. Many children end up spending more time alone, playing on the internet without supervision (Muller et al. 2023; ECPAT, INTERPOL, and UNICEF 2022). This is not just a family issue. It is also a workplace system issue.
Here, the role of companies or offices where parents work is very important. If they provide humane working hours, sufficient family time, and support for parents, the risks to children can be reduced (Friedman 2001; Family Forward NC 2024).
Conversely, if parents are constantly forced to work without time for family, children will seek “friends” in the digital world—and that is where the risks emerge.
This is indeed a dilemma. On one hand, work enables families to meet their living needs. On the other, time spent with children is greatly reduced, especially when both father and mother work to provide for the family.
Many parents come home exhausted from work and still have to carry on with household chores, including childcare. Often, child-rearing becomes the last priority, and children are handed over to gadgets.
The Church Must Not Stay Silent
For Christians, this is not just a social issue. It is a matter of faith. The Bible teaches that every child is a precious creation of God (imago Dei, Genesis 1:26–27). Harming a child means damaging something very valuable in God’s eyes. Jesus Himself spoke strongly about this (Matthew 18:6).
Schmutzer states that sexual abuse is an act that distorts the created order because it damages the body, soul, and human relationships. Abuse also violates the mandate of stewardship (Genesis 1:28) and twists authority into oppression (Schmutzer 2008).
Therefore, the church must not only speak but act. The church is called to be a community of healing. The World Council of Churches (WCC) affirms that protecting children in digital space is an integral part of the church’s mission in the technological era: the church must provide moral leadership and strong advocacy for child protection policies, and extend its historic commitment to education and social care into the virtual realm (World Council of Churches 2026).
International Justice Mission (IJM)—a global Christian organization—has been actively fighting online sexual exploitation of children (OSEC) with a faith-based approach, emphasizing that technology must be a tool for justice, not a weapon of exploitation (International Justice Mission 2026).
In Indonesia, the Communion of Churches in Indonesia (PGI) has firmly urged all its members to respond to the “emergency of child sexual violence,” including the shift of perpetrators’ modus operandi to the online realm. It calls for safeguarding policies, Child-Friendly Church (GRA) programs, increased pastoral capacity for accompanying victims, and cross-sector collaboration with the government (Persekutuan Gereja-Gereja di Indonesia 2025).
From the perspective of Christian Ethics, responsible technology management is also required (Crouch 2017). AI is not neutral; when used to create deepfake CSAM, it violates the principles of truth, love, and protection of the weak (James 1:27; Micah 6:8).
Therefore, the church must continually make itself a safe place for children. Sunday School activities should be designed to cultivate children’s creativity without always depending on gadgets. Children should be trained to develop their full mental, emotional, and motor abilities as God’s priceless gifts.
The church also needs to educate parents and children about the dangers of the digital world without being anti-technology. Technology must be seen as a tool, whose use fully depends on human intelligence and morality.
In handling victims, the church must accompany them, not judge them. The church must collaborate with the government and various elements of society to find joint solutions.
Parents: Be a Friend, Not Just a Supervisor
Many parents feel unprepared to face the digital world. That is understandable. But children do not need parents who are the most tech-savvy. They need parents who are present.
The issue is not just that children use the internet. The issue is when children are alone there. Simple things parents can do:
- Know who your child is interacting with (including online). Occasionally check who they are playing with online;
- Build open communication. It is better to place gadgets in the family room, not in the bedroom;
- Teach boundaries about the body and privacy. Children are easily swayed by social media trends that can be “vulgar” and may draw the attention of predators. Simple things, such as a child’s appearance and gestures, deserve ongoing parental attention;
- Accompany, do not just forbid. When children begin doing inappropriate things, whether in speech, gestures, or appearance, parents must be able to guide them back.
Andy Crouch (2017) reminds us that technology is not the enemy. But if not guided, it can take over our lives.
The most important thing is actually simple: Children need to be heard. Children need to be trusted. Children need to be accompanied. Often, children become victims not because they are “naughty,” but because they feel alone.
Conclusion
Digital space can be dangerous. But it can also be safe—if we protect it together.
The state needs to be firm. Technology companies must be responsible. The church must care. And parents must be present. Because in the end, children are not just internet users. They are individuals we must protect.
References
Crouch, Andy. 2017. The Tech-Wise Family: Everyday Steps for Putting Technology in Its Proper Place. Grand Rapids, MI: Baker Books.
DW. 2026. “Pertama di Dunia, Indonesia dan Malaysia Blokir Grok AI.” DW, April 2026. Accessed April 22, 2026. https://www.dw.com/id/pertama-di-dunia-indonesia-dan-malaysia-blokir-grok-ai/a-75472912.
ECPAT, INTERPOL, and UNICEF. 2022. Disrupting Harm in Indonesia: Evidence on Online Child Sexual Exploitation and Abuse. Bangkok: ECPAT International.
Family Forward NC. 2024. “Preventing Child Abuse Through Family-Friendly Workplace Policies.” Accessed April 22, 2026. https://familyforwardnc.com/preventing-child-abuse-through-family-friendly-workplace-policies/.
Friedman, Dana E. 2001. “Employer Supports for Parents with Young Children.” The Future of Children 11, no. 1: 63–77.
Fry, D., et al. 2025. “Prevalence Estimates and Nature of Online Child Sexual Exploitation and Abuse: A Systematic Review and Meta-Analysis.” The Lancet Child & Adolescent Health. https://doi.org/10.1016/S2352-4642(24)00329-8.
International Justice Mission. 2026. “Online Sexual Exploitation of Children.” Accessed April 22, 2026. https://www.ijm.org/our-work/trafficking-slavery/online-sexual-exploitation-children.
Internet Watch Foundation (IWF). 2026. “Harm without Limits: AI Child Sexual Abuse Material through the Eyes of Our Analysts.” https://www.iwf.org.uk/media/hl1nvdti/iwf-ai-csam-report-2026.pdf.
Kompas. 2026. “Gim hingga AI, Ladang Eksploitasi Predator Anak.” Harian Kompas and Kompas.id, April 20, 2026.
Muller, Karen, Astrid Gonzaga Dionisio, and Sanghyun Park. 2023. Online Knowledge and Practice of Parents and Children in Indonesia: Baseline Study. Jakarta: UNICEF Indonesia. https://www.unicef.org/indonesia/media/23586/file/online-knowledge-practice-parents-and-children-Indonesia-baseline-study-2023.pdf.
National Center for Missing & Exploited Children (NCMEC). 2025. 2024 CyberTipline Report. https://www.missingkids.org/cybertiplinedata.
Persekutuan Gereja-Gereja di Indonesia. 2025. “Darurat Kekerasan Seksual Anak: Gereja Didorong Aktif Lindungi Anak.” April 30, 2025. https://pgi.or.id/news/warta-pgi/darurat-kekerasan-seksual-anak:-gereja-didorong-aktif-lindungi-anak-16.
Republik Indonesia. 2025. Peraturan Pemerintah Nomor 17 Tahun 2025 tentang Tata Kelola dan Perlindungan Anak di Ruang Digital. Jakarta: Pemerintah Republik Indonesia. https://jdih.komdigi.go.id/produk_hukum/view/id/965/t/peraturan+pemerintah+nomor+17+tahun+2025.
Schmutzer, Andrew J. 2008. “A Theology of Sexual Abuse: A Reflection on Creation and Devastation.” Journal of the Evangelical Theological Society 51, no. 4: 785–812.
Thorn. 2025. Safety by Design: Annual Progress Report April 2024–April 2025. https://info.thorn.org/hubfs/Thorn_SafetyByDesign_AnnualProgressReport_April2024-April2025.pdf.
UNICEF Innocenti. 2025. Protecting Children in Online Gaming: Mitigating Risks from Organized Violence. Florence: UNICEF Office of Research – Innocenti.
WeProtect Global Alliance. 2025. Global Threat Assessment 2025. https://www.weprotect.org/news/experts-unveil-practical-plan-to-end-technology-facilitated-child-sexual-abuse-crisis/.
World Council of Churches. 2026. “Digital Childhood Protection: Europe and the Church’s Mission.” Blog post, March 3, 2026. https://www.oikoumene.org/blog/digital-childhood-protection-europe-and-the-churchs-mission.