Facebook, now Meta, is probably one of the most scrutinised companies in the world – perhaps rightly so. The company has recently faced a barrage of bad publicity over its practices, accused of putting profits before children’s mental health, and even before this it had been challenged over its plan to launch a version of Instagram for kids. An article in the NY Times highlights the defensive argument: many children already get around Instagram’s age requirement, so surely it “would be better to develop a version more suitable for them”. This makes some sense and supports the case for such an app.

However, a report published in July by 5Rights Foundation, a London-based group focused on digital rights issues for children, showed that accounts set up as 13-year-olds were targeted, within 24 hours, with inappropriate and potentially harmful content, including material about eating disorders, body shaming, sexualised imagery and suicide. Clearly there is a major problem. In response, Facebook has “pledged to introduce new parental control features for the existing Instagram app in the coming months while the company continues to consider a version for children”.

At Escudo Web we take concerns about platforms like Facebook very seriously. Our Classroom Manager solution not only prevents children from accessing inappropriate content while at school (as managed by the IT Manager) but also enables parents to manage the access their child has outside school hours, either maintaining the school’s filter level or customising it according to their own views and beliefs. This involves parents directly in their children’s education through a remote app on their phone, without ever needing to look over the child’s shoulder or take away a device.
This is an easier way to protect children from online harms than relying on self-regulation by platforms which, as we now know, are more concerned with profits than anything else.

#facebook #safeguarding