
where are the robots | Mar 3, 2023

Introduction

Welcome to another one of my writeups! In this picoGym challenge, you'll learn: reading a web crawler file (robots.txt)! Without further ado, let's dive in.

Background

Author: zaratec/Danny

Description

Can you find the robots? https://jupiter.challenges.picoctf.org/problem/60915/ (link) or http://jupiter.challenges.picoctf.org:60915

Enumeration

Home page:

Pretty empty.

The challenge's title and the home page both refer to a file called robots.txt, which is a file for web crawlers (Internet bots that systematically browse the World Wide Web). It tells crawlers which paths on the site they should not visit.
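For reference, a typical robots.txt is just a list of directives: User-agent picks which crawler a group of rules applies to, and Disallow/Allow list the paths that crawler should skip or may visit. A minimal, hypothetical example (the paths here are made up):

User-agent: *              # these rules apply to every crawler
Disallow: /admin/          # please don't crawl this path
Allow: /public/            # this path is fine to crawl
Sitemap: https://example.com/sitemap.xml

Nothing actually stops a human (or curl) from requesting a disallowed path, which is exactly what makes this file interesting in a CTF.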

Let’s try to read that file:

┌[siunam♥earth]-(~/ctf/picoGym/Web-Exploitation)-[2023.03.03|18:14:29(HKT)]
└> curl https://jupiter.challenges.picoctf.org/problem/60915/robots.txt                           
User-agent: *
Disallow: /8028f.html

As you can see, it's disallowing web crawlers (not us) from viewing /8028f.html.
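If the file had many Disallow entries, we could list them all with a quick one-liner instead of reading it by eye (just a sketch against the same base URL, assuming awk is available):

curl -s https://jupiter.challenges.picoctf.org/problem/60915/robots.txt | awk '/^Disallow:/ {print $2}'

Here there's only one entry: /8028f.html.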

Let’s go there!

┌[siunam♥earth]-(~/ctf/picoGym/Web-Exploitation)-[2023.03.03|18:25:41(HKT)]
└> curl -s https://jupiter.challenges.picoctf.org/problem/60915/8028f.html | html2text

Guess you found the robots
picoCTF{ca1cu1at1ng_Mach1n3s_8028f}

We found the flag!
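As a bonus, the whole lookup can be scripted end to end: fetch robots.txt, request every disallowed path, and grep for the flag format. This is only a sketch (not the exact commands used above), and it assumes every disallowed path lives under the same base URL:

# fetch robots.txt, extract each Disallow path, request it, and grep for the flag pattern
base='https://jupiter.challenges.picoctf.org/problem/60915'
for path in $(curl -s "$base/robots.txt" | awk '/^Disallow:/ {print $2}'); do
    curl -s "$base$path" | grep -oE 'picoCTF\{[^}]+\}'
done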

What we’ve learned:

  1. Reading a web crawler file (robots.txt)