When a 15-year-old boy vanished from New York City in early January, his family turned to the public for help. Police confirmed that Thomas Medlin boarded a train to Manhattan on January 9, after allegedly arranging to meet someone he had met online through Roblox; his last known location was near Grand Central Station. Nearly a month later, he remains missing, and his mother has described his disappearance as completely out of character.
Yet Roblox’s response to the case—officially released through its corporate communications channel—reads less like a message of solidarity and more like a preemptive legal defense. The statement begins with standard corporate condolences but quickly shifts to a detailed breakdown of its internal review, effectively arguing that its platform played no role in the disappearance. While the company insists it has cooperated with law enforcement, the framing feels more concerned with damage control than genuine support for the family.
The company’s review, outlined in bullet points, claims no evidence was found of off-platform communication attempts, shared contact information, or suspicious chat activity. But the timing and tone of the statement, issued while the child remains missing, risk appearing dismissive rather than supportive. Even if a legal defense was prudent, a more empathetic approach could have emphasized Roblox’s willingness to assist in the search rather than its own innocence.
Police continue to urge anyone with information to contact the Suffolk County Police Department. The case serves as a stark reminder of the challenges platforms face in balancing user safety with corporate accountability.
