Basics of Web Hacking

Just a few tips...



There are several articles here on HBH about hacking, but I haven't seen any that explain what to look for; most of them are written for specific exploits. So anyways...

Whenever you're trying to hack a site, there are a few things you should do.

1. Look for common directories.
------------------------------------------------------
Whenever someone makes a site, they usually name directories after what they hold. Ex. If I was going to store info on UFO theories, I wouldn't name the directory something that has nothing to do with UFOs; I would probably name it "ufo", "ufos", or maybe "ufofiles". Anyway, just look for what you think the owner would have named it. Here are a few common directory names (a short sketch after the list shows one way to try them automatically).

admin
bak
test
files
include
includes
images
members
users
tmp
logs
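
If you'd rather not try these by hand, a small script can check each one for you. Below is a minimal sketch, assuming Python 3 and only the standard library; the target URL is a placeholder, not a real site.

***************************************
# check_dirs.py - try a few common directory names against a site
# (illustrative sketch only; the target is a placeholder)
import urllib.request
import urllib.error

target = "http://example.com"   # replace with the site you are testing
common_dirs = ["admin", "bak", "test", "files", "include", "includes",
               "images", "members", "users", "tmp", "logs"]

for name in common_dirs:
    url = "%s/%s/" % (target, name)
    try:
        resp = urllib.request.urlopen(url, timeout=5)
        print("found:", url, "(HTTP %d)" % resp.getcode())
    except urllib.error.HTTPError as e:
        # a 403 often means the directory exists but you can't list it
        if e.code == 403:
            print("exists but forbidden:", url)
    except urllib.error.URLError:
        pass   # host unreachable or connection refused; move on
***************************************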


2. Look in the source.
-------------------------------------------------------
Webmasters often leave comments in the source. While they may not leave their passwords there, they very often leave the path to the login page.
Ex.
<!-- /admin/admnlgn.php -->
Also, if there's more than one person working on the site, they might leave each other notes in the source.
Ex.
<!-- Hey Joe, im gonna get some sleep now, could you go ahead and finish up the login for me. -->
This tells you that the login isn't finished and may be exploitable. So as a rule, always look in the source; you never know what you'll find.
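
A quick way to do this without reading a whole page by eye is to pull the HTML and print just the comments. Here is a minimal sketch, assuming Python 3; the URL is a placeholder.

***************************************
# find_comments.py - fetch a page and print any HTML comments in its source
# (illustrative sketch only; the URL is a placeholder)
import re
import urllib.request

url = "http://example.com/"   # replace with the page you are reading
html = urllib.request.urlopen(url, timeout=5).read().decode("utf-8", "replace")

# match <!-- ... --> comments, including ones spanning several lines
for comment in re.findall(r"<!--(.*?)-->", html, re.DOTALL):
    print(comment.strip())
***************************************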


3. Look for a robots.txt.
-------------------------------------------------------
Google and other search engines index sites to make them searchable. This poses a problem for site owners who have things they'd rather not be searchable. So, to remedy this, site owners can place a file called "robots.txt" in their root directory.
It might look something like this:
***************************************
User-agent: *
Disallow:
Disallow: /includes/
Disallow: /logs/
***************************************
So now you know there are two directories: one named "includes" and one named "logs".
To see a real example of one, go to http://whitehouse.gov/robots.txt .
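
If you want to pull those paths out automatically, something like the following works. Again, this is just a sketch in Python 3, and the domain is a placeholder.

***************************************
# read_robots.py - list the paths a site's robots.txt asks crawlers to skip
# (illustrative sketch only; the domain is a placeholder)
import urllib.request

base = "http://example.com"   # replace with the site you are testing
text = urllib.request.urlopen(base + "/robots.txt", timeout=5).read().decode("utf-8", "replace")

for line in text.splitlines():
    line = line.strip()
    if line.lower().startswith("disallow:"):
        path = line.split(":", 1)[1].strip()
        if path:                      # an empty Disallow line blocks nothing
            print(base + path)
***************************************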

Also, if the site is owned by a decent-sized company, you can bet that the directory names, usernames, and passwords will have something to do with people's jobs.
Ex.
Someone with the job of "service technician" might have a user name of "srvctch001" and a pass of "s3rv1c3t3ch". So, just mess around a bit.
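
To be clear, the scheme below is just a guess to illustrate the idea: strip the vowels from a job title and tack a number on the end. Any real company will have its own convention, so treat this as a sketch, not a recipe.

***************************************
# guess_users.py - build candidate usernames from a job title
# (purely illustrative; the abbreviation and numbering scheme are assumptions)
title = "service tech"

# drop vowels and spaces: "service tech" -> "srvctch", like the example above
abbrev = "".join(c for c in title if c.isalpha() and c not in "aeiou")
candidates = ["%s%03d" % (abbrev, n) for n in range(1, 4)]
print(candidates)   # ['srvctch001', 'srvctch002', 'srvctch003']
***************************************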

Well, I hope this article is helpful. It's aimed at beginners, so I hope it's easily understandable.

--Adlez

Comments

h4xguy on September 29 2006 - 02:37:26
Hey adlez that was great but I didn't understand it and I'm rating it Awesome!
adlez on September 29 2006 - 02:40:34
Pfft uhm. which part didnt u understand, must make it 1337 and sooper understandable.
CrazyCaity123 on September 29 2006 - 03:17:19
You didnt understand it but rated it awesome? o.O Lol anyway, Good article. Very interesting.
enforcer on September 29 2006 - 10:13:52
you should have added about the use of web spiders, and cgi bugs and cgi bug scanners... but overall, a very nice beginner article, and you also didn't spoon feed all the script kiddies.
god on September 29 2006 - 15:04:19
Someone with the job of "service technician" might have a user name of "srvctch001" and a pass of "s3rv1c3t3ch". So, just mess around a bit.
i think this is very rare to be found..
adlez on September 29 2006 - 18:01:22
Thanks. @god I got the db info for a pretty big company, and ther're passes were like that. Only thing is, I can't connect, not cause of the passes, but because it says that it can't find the server.
AbSoRb on September 29 2006 - 18:24:52
Nice informative article
-The_Flash- on September 29 2006 - 21:11:36
Usually default passwords would be initials/d.o.b of emplyer. eg John Doe , 05-08-1979 JD5879 would be a pass
mozzer on September 29 2006 - 21:45:30
/me agrees with Flash, Great article, missing the 5 major web vulnerabilities though
Thiseas on October 02 2006 - 19:22:19
Hmmm.... you sould solve more challenges (4 my op) to give a global review of all of them.... now you have only a small part of the... picture.
teresa_fuego on October 10 2006 - 16:21:20
Great article and very informative for newbie, like me.
Lionz on January 29 2012 - 17:37:00
yes thanks that was helpful