Category Archives: CTF

One of These is Not Like the Other

NYU-Poly hosts an annual Capture the Flag (CTF) competition called CSAW (Cyber Security Awareness Week). They have a qualification competition and then a final competition for the top few teams from the qualification round. Our team, the Whitehatters Computer Security Club at the University of South Florida (WCSC for short), competed in the qualification CTF.

There were a number of categories in this particular CTF, including trivia, reconnaissance, web, reversing, exploitation, forensics, and networking. Several people on my team looked at the forensics challenges, but one of them eluded all of us. The challenge was titled “One of These is Not Like the Other” and consisted of a simple PNG image. The original image is shown below:

CSAW 2012 Qualification - Forensics 200

At first glance, we all assumed some sort of steganography, the practice of hiding messages inside images or audio files. After a bit of steganography analysis on the image, I concluded the actual picture was irrelevant and was intended to be a red herring.

A common tool used in CTF challenges is called strings. Running strings on this picture generates output that contains many interesting entries of the form:

key{FIRSTNAME LASTNAME}

(The full output can be viewed here: https://gist.githubusercontent.com/billymeter/c09747733d5810953e49/raw/58681a0fa59e51048e12e9d2234439053f29a1d2/gistfile1.txt)

To defeat the challenge, a key, or flag, must be submitted to the scoreboard. There are plenty of keys in this file, but only one of them is correct. Running the following command:

brad@bt[~/Desktop]
[15:22]: strings version1.png | grep key{ | wc
     500    1001   10490

shows that there are 500 keys in this file! Which one is the correct one?
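The same extraction is easy to script. Here is a short Python sketch (my own, not part of the original solve) that pulls every key{...} candidate straight out of the raw file bytes, equivalent to the strings | grep pipeline above:

```python
import re


def extract_keys(path):
    """Return every key{...} byte string found anywhere in the file."""
    with open(path, "rb") as f:
        data = f.read()
    # Match "key{" followed by anything up to the closing brace.
    return re.findall(rb"key\{[^}]+\}", data)


# For the challenge file, this should report the same 500 candidates:
# print(len(extract_keys("version1.png")))
```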

I searched for some sort of pattern in the names themselves, but never found one. Thinking back to the challenge category, forensics, prompted me to actually research the technical file format of a PNG file.

Reviewing the PNG technical specification, found here: http://www.libpng.org/pub/png/spec/iso/index-object.html, shows that PNG files are composed of data structures called chunks. Opening the image file in a hex editor, we can see that all of these keys embedded in the file are tEXt chunks, as shown below:
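To make the chunk layout concrete: per the specification, each chunk is a 4-byte big-endian length, a 4-byte ASCII type, the data itself, and a 4-byte CRC. A minimal Python sketch (not part of the original solve) can walk the chunks of any PNG:

```python
import struct

PNG_SIGNATURE = b"\x89PNG\r\n\x1a\n"


def walk_chunks(path):
    """Return a list of (chunk_type, data_length) for every chunk in a PNG."""
    with open(path, "rb") as f:
        data = f.read()
    assert data[:8] == PNG_SIGNATURE, "not a PNG file"
    chunks = []
    offset = 8  # skip the 8-byte PNG signature
    while offset < len(data):
        # Each chunk: 4-byte big-endian length, 4-byte ASCII type,
        # <length> bytes of data, then a 4-byte CRC.
        (length,) = struct.unpack(">I", data[offset:offset + 4])
        ctype = data[offset + 4:offset + 8].decode("ascii")
        chunks.append((ctype, length))
        offset += 12 + length  # 4 (length) + 4 (type) + data + 4 (CRC)
        if ctype == "IEND":
            break
    return chunks
```

Running this against the challenge image would list IHDR, the hundreds of tEXt chunks, the IDAT chunks, and finally IEND.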

hex

Rather than part of the image, or one of the names themselves, being “not like the others,” perhaps one of these tEXt chunks is structurally not like the others. If the chunks in this PNG file are not formatted properly, then surely there is a tool to help us find the bad ones.

Luckily, there is. The particular tool that I used is called pngcheck. It can be found here: http://www.libpng.org/pub/png/apps/pngcheck.html. This tool will scan and make sure that a PNG image is formatted properly. Running pngcheck on the image:

brad@bt[~/Desktop]
[16:10]: pngcheck version1.png
version1.png  CRC error in chunk tEXt (computed 5005ed3c, expected 26594131)
ERROR: version1.png

shows that there is an error with a tEXt chunk! But which one is it? We will check the help documentation for pngcheck:

brad@bt[~/Desktop]
[16:11]: pngcheck
PNGcheck, version 2.3.0 of 7 July 2007,
   by Alexander Lehmann, Andreas Dilger and Greg Roelofs.

Test PNG, JNG or MNG image files for corruption, and print size/type info.

Usage:  pngcheck [-7cfpqtv] file.{png|jng|mng} [file2.{png|jng|mng} [...]]
   or:  ... | pngcheck [-7cfpqstvx]
   or:  pngcheck [-7cfpqstvx] file-containing-PNGs...

Options:
   -7  print contents of tEXt chunks, escape chars >=128 (for 7-bit terminals)
   -c  colorize output (for ANSI terminals)
   -f  force continuation even after major errors
   -p  print contents of PLTE, tRNS, hIST, sPLT and PPLT (can be used with -q)
   -q  test quietly (output only errors)
   -s  search for PNGs within another file
   -t  print contents of tEXt chunks (can be used with -q)
   -v  test verbosely (print most chunk data)
   -x  search for PNGs within another file and extract them when found

Note:  MNG support is more informational than conformance-oriented.

If we run pngcheck with the -7 flag, then we should see where the bad tEXt chunk is.

brad@bt[~/Desktop]
[16:11]: pngcheck -7 version1.png
File: version1.png (1443898 bytes)
XML:com.adobe.xmp:
    (no translated keyword, 393 bytes of UTF-8 text)
comment:
    key{rodney danielle}
comment:
    key{matthieu blayne}
<<<<<<<<<<<<<<<<<<<<<<<<<<<<< SNIP >>>>>>>>>>>>>>>>>>>>>>>>>>>>
comment:
    key{nguyen willie}
comment:
    key{takeuchi gregory}
version1.png  CRC error in chunk tEXt (computed 5005ed3c, expected 26594131)
ERROR: version1.png

So we see that key{takeuchi gregory} is in the bad tEXt chunk. Submitting “takeuchi gregory” to the scoreboard solved the challenge.
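As a postscript, pngcheck's CRC test is straightforward to reproduce: per the PNG specification, each chunk's CRC-32 is computed over the chunk type and data fields (not the length). This Python sketch (not the tool used during the CTF) finds any mismatched chunk directly:

```python
import struct
import zlib


def find_bad_chunks(path):
    """Return (chunk_type, data) for every chunk whose stored CRC is wrong."""
    with open(path, "rb") as f:
        data = f.read()
    assert data[:8] == b"\x89PNG\r\n\x1a\n", "not a PNG file"
    bad = []
    offset = 8  # skip the 8-byte PNG signature
    while offset < len(data):
        (length,) = struct.unpack(">I", data[offset:offset + 4])
        ctype = data[offset + 4:offset + 8]
        body = data[offset + 8:offset + 8 + length]
        (stored,) = struct.unpack(
            ">I", data[offset + 8 + length:offset + 12 + length])
        # The CRC covers the chunk type and data, but not the length field.
        if zlib.crc32(ctype + body) & 0xffffffff != stored:
            bad.append((ctype.decode("ascii"), body))
        offset += 12 + length
        if ctype == b"IEND":
            break
    return bad


# On the challenge file, this should flag the single corrupted tEXt chunk,
# whose data contains key{takeuchi gregory}.
```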