What are the three most effective programming languages for hacking, in your view?
#1
So if you had to pick just three primary programming languages to hack with, what would they be and why?

I don't know myself because I'm not yet ready to hack.

Thanks for the feedback, btw. I'm just curious to see what people's opinions are on this.
#2
Overall, I would say knowing one systems programming language and one scripting language would give you the widest range of knowledge and capabilities. So something like C or C++, plus Python.

But it really depends on what your focus will be. If all you're doing is hacking web apps, you probably don't need to know C/C++. Python might be useful in some instances, especially if you're writing exploits, but in that case I might suggest JavaScript and PHP instead, since that's largely what drives the internet.

If you're hacking mobile apps, then I'd probably say Swift, Java, and JavaScript are your three most useful languages to learn.

If you're trying to find vulnerabilities in server applications, network stacks, or operating systems, then C/C++ and Python.

So like everything... it all depends.
#3
(04-18-2020, 01:01 AM)MuddyBucket Wrote: Overall, I would say knowing one systems programming language and one scripting language would give you the widest range of knowledge and capabilities. So something like C or C++, plus Python.

But it really depends on what your focus will be. If all you're doing is hacking web apps, you probably don't need to know C/C++. Python might be useful in some instances, especially if you're writing exploits, but in that case I might suggest JavaScript and PHP instead, since that's largely what drives the internet.

If you're hacking mobile apps, then I'd probably say Swift, Java, and JavaScript are your three most useful languages to learn.

If you're trying to find vulnerabilities in server applications, network stacks, or operating systems, then C/C++ and Python.

So like everything... it all depends.

Would you say that a mix of web languages, Python, C++, and maybe some command-line scripting would be ideal?
#4
(04-18-2020, 06:43 AM)QMark Wrote: Would you say that a mix of web languages, Python, C++, and maybe some command-line scripting would be ideal?

Ideal is in the eye of the beholder. I made these recommendations because:

1. A systems programming language teaches you how the operating system, CPU, and memory work.

2. A scripting language like Python is useful for quickly automating tasks or getting code out the door. It's cross-platform, so you don't need to rewrite code for different kernels. Python is the widespread language of the day, but you could use Perl, PHP, Ruby, etc. for the same effect. PHP, Perl, Ruby, and Python can all be used to build websites as well.

3. JavaScript, because JavaScript has been fucking everywhere the last few years. It's turned into a fairly robust language; there's not a whole lot you can't do with it these days.

By all means, if you want to add to this, it's not going to hurt. I recommended Python over PHP because I think it gives you a wider range of capabilities, but if you want to do PHP instead, you can.

These three will just give you the biggest bang for your buck in terms of learning and capability.
#5
I would say Python, C, and ASM. Python because it's great for whipping up a tool and it's so widespread. C because it's fast, widespread, good for tools, and good for developing exploits, malware, etc. ASM is most useful for exploit devs and reverse engineers, but it's still useful to know.
#6
Can't say much since I don't use much other than Python, but I definitely agree that Python is a great language. You can make quick and easy scripts to do just about anything.
#7
I was going to give the same recommendations as MuddyBucket, so Python, C, C++. Since he already did, I am just reiterating at this point.

(04-24-2020, 07:30 PM)Dismal_0x8 Wrote: I would say Python, C, and ASM. Python because it's great for whipping up a tool and it's so widespread. C because it's fast, widespread, good for tools, and good for developing exploits, malware, etc. ASM is most useful for exploit devs and reverse engineers, but it's still useful to know.

It's important to remember that ASM isn't exactly one language. There's some overlap, but it comes in dialects tied to the various architectures.
#8
Vector Wrote: It's important to remember that ASM isn't exactly one language. There's some overlap, but it comes in dialects tied to the various architectures.
x86, more specifically. So I'm curious why you chose C++. I don't hear a whole lot about it.
#9
I'd go with C, Python, and JavaScript. Master those and you can do a lot of damage!
#10
Old thread, but because of the gravedig I'll drop my two cents on the subject.

For the most part, it's the same as everyone else said, except that in specific use cases you could substitute JS entirely for Python.

Actually, that's about it. Which languages you should learn varies entirely based on what specifically you're doing.



Systems languages are good to learn just to understand systems at a lower level in general. But I'll disagree with another reply saying that they teach you how an OS works. That's rarely the case unless you're doing something specifically related to manipulating data through an operating system's respective API.

In other words, unless you're trying to allocate memory, load/reflect a binary into that segment of memory, mark it as executable, and then spawn a new process using the OS API (e.g. WinAPI) or a syscall like Linux's fork(), you can learn C without needing to learn WinAPI or the Linux API.
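
To make that concrete, here's a rough sketch of what that OS-API path looks like on the Linux side. Assumptions: x86-64, the single 0xC3 "ret" byte is just a placeholder payload, and a hardened system may refuse a writable+executable mapping; it's an illustration, not a working loader.
Code:
#include <string.h>
#include <sys/mman.h>
#include <sys/wait.h>
#include <unistd.h>

int main(void) {
    /* a single x86-64 "ret" instruction stands in for a real payload */
    unsigned char code[] = { 0xC3 };

    /* ask the kernel for an anonymous page we can write to and execute */
    void *mem = mmap(NULL, sizeof(code), PROT_READ | PROT_WRITE | PROT_EXEC,
                     MAP_PRIVATE | MAP_ANONYMOUS, -1, 0);
    if (mem == MAP_FAILED)
        return 1;
    memcpy(mem, code, sizeof(code));

    /* spawn a child process and run the mapped code in it */
    if (fork() == 0) {
        ((void (*)(void))mem)();
        _exit(0);
    }
    wait(NULL);
    return 0;
}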

I tend to recommend C over C++ just because learning C is much simpler than C++ and has way less overhead. For instance, in C you can only declare a fixed-size array, and when the process runs and tries to allocate memory for that array, it finds the first unused segment of memory that has the space to fit it.

An array of integers in C (signed 32 bit) of size 20 will try to find:
4 (bytes per integer) * 20 (size of array) = 80 bytes free
to allocate that memory.

Whereas in C++ you could use std::vector to create a variable-sized array: the program first allocates a block for the initial elements, and if that block runs out of room, it allocates a larger area somewhere else and moves the elements over to it.

But that creates overhead, and although vectors may be easier to use than arrays, they're more for you to learn, and they add overhead to your program in how much memory is being used, how the computer treats each piece of data, and so forth.
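
A quick sketch of the difference, assuming 4-byte ints (this one has to be C++, since std::vector doesn't exist in C):
Code:
#include <cstdio>
#include <vector>

int main() {
    int fixed[20];                        // C-style: one flat block, 4 * 20 = 80 bytes
    std::printf("%zu\n", sizeof(fixed));  // prints 80 where int is 4 bytes

    std::vector<int> growable;            // C++: the size can change at runtime
    for (int i = 0; i < 25; ++i)
        growable.push_back(i);            // may reallocate and move the elements as it grows
    std::printf("%zu\n", growable.size() * sizeof(int)); // 100 bytes of elements,
                                                         // plus the vector's own bookkeeping
    return 0;
}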

C will always be dominant in terms of execution speed and general optimization with gcc (or, if you're like me, with musl-libc over glibc), but if you're creating more complex applications, like anything that uses WinAPI (lots of structs and function pointers), then C++ might be preferable for the ease of manipulating that data, storing it, and using classes to organize your application in general. But to use C++ properly and keep it running safely and quickly, you need a lot more programming experience. Most senior engineers will tell you that getter/setter functions in classes are generally frowned upon in C++, even though every Java textbook/course you find treats them as normal.

The reason is simple: a getter might make sense in a banking application to retrieve an account balance, but you wouldn't use a setter to change the balance for a deposit or withdrawal. Rather, you would have a function that adds funds for a deposit or subtracts funds for a withdrawal.
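
Something like this hypothetical Account class is what I mean; the names are made up, it's just to show the shape of it:
Code:
#include <stdexcept>

class Account {
public:
    // read access is fine: callers legitimately need to see the balance
    long long balance() const { return cents_; }

    // no setBalance(); the balance only changes through operations that mean something
    void deposit(long long cents) {
        if (cents <= 0) throw std::invalid_argument("deposit must be positive");
        cents_ += cents;
    }

    void withdraw(long long cents) {
        if (cents <= 0 || cents > cents_) throw std::invalid_argument("bad withdrawal");
        cents_ -= cents;
    }

private:
    long long cents_ = 0;
};

int main() {
    Account a;
    a.deposit(500);      // 5.00
    a.withdraw(200);     // 2.00
    return a.balance() == 300 ? 0 : 1;
}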

There's only one relevant quote I can give on the C-versus-C++ debate (paraphrasing Bjarne Stroustrup):
"C makes it easy to shoot yourself in the foot. C++ makes it harder, but when you do, you blow your whole damn leg off."



On the subject of scripting languages, yeah, Python can get you pretty far in the realm of prototyping. But, in my opinion, that's all I would honestly use it for; I wouldn't build massive applications with it. There are exceptions (IIRC, Dropbox's entire codebase is written in Python), but I think you'd all agree that Python isn't provable or reliable in terms of operability and maintainability.

Now, you're all fuming from that statement, but let me elaborate.

When you're writing a script to brute-force a login page, for instance, you're probably writing this in your first few lines:
Code:
import requests

It's an easy-to-use library that makes any sort of network request painless. I mean, hell, what you can do with two lines of requests code is what I'd do with 100 lines of C using netinet, netdb.h, sys/socket.h, etc.
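
For comparison, here's a stripped-down sketch of the raw-socket version (plain HTTP against example.com as a stand-in host, with barely any error handling; the full 100 lines come from doing it properly):
Code:
#include <netdb.h>
#include <stdio.h>
#include <string.h>
#include <sys/socket.h>
#include <sys/types.h>
#include <unistd.h>

int main(void) {
    /* resolve the hostname to an address */
    struct addrinfo hints, *res;
    memset(&hints, 0, sizeof(hints));
    hints.ai_family   = AF_INET;
    hints.ai_socktype = SOCK_STREAM;
    if (getaddrinfo("example.com", "80", &hints, &res) != 0)
        return 1;

    /* open a TCP socket and connect to port 80 */
    int fd = socket(res->ai_family, res->ai_socktype, res->ai_protocol);
    if (fd < 0 || connect(fd, res->ai_addr, res->ai_addrlen) != 0)
        return 1;

    /* hand-write the HTTP request that requests builds for you */
    const char *req =
        "GET / HTTP/1.1\r\nHost: example.com\r\nConnection: close\r\n\r\n";
    send(fd, req, strlen(req), 0);

    /* dump whatever the server sends back */
    char buf[4096];
    ssize_t n;
    while ((n = recv(fd, buf, sizeof(buf), 0)) > 0)
        fwrite(buf, 1, (size_t)n, stdout);

    close(fd);
    freeaddrinfo(res);
    return 0;
}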

But my point is that those C libraries are RARELY updated except to fix security issues. Not to mention, the C language itself has a very limited set of possible error messages when you're debugging your code. And when something doesn't work, there's a very specific reason for it not to work, which makes sense once you plug it into gdb or WinDbg to get the raw machine code and disassembly.

But if the maintainers of requests decided to push an update one day and it broke your code without you changing your side of it at all, you'd freak out, and because of the near-endless error messages Python can throw at you, you'd have a hard time figuring out what's going on. It's also harder to plug a Python script into a debugger than a native binary, just because of how the interpreter converts the code into its own bytecode.

So, in reality, when you import libraries made by third parties, you're adding another layer of uncertainty to your code.

Will it always break? No. Rarely, actually. But the possibility lies there, and it's not the library maintainer's problem if your code doesn't work when you use their library.

Python is great for prototyping. Like I said, what I do in 100 lines of C, you can do in 2. But at the end of the day, your code has a chance of breaking after an update, whereas my code likely won't ever break, because *libc is already well optimized and rarely gets updates in that regard, and if my code does break, it's much easier to figure out exactly why and go fix it myself.

(Yes, I acknowledge the data science field and their use of Python, but even then a lot of data scientists have been making a push towards using R because of its speed and reliability. Python is still used just because data scientists aren't programmers and Python makes it easier for them to write something that calculates what they need it to calculate.)



JavaScript is a bit of a love-hate relationship for me. Sure, it's everywhere, but just because it's everywhere doesn't make it any good.

It's a buggy piece of shit with no consistency in its implicit type coercion.

Open your browser console and type this in:
Code:
console.log('5' - 3)

You should get 2 returned (5 - 3 = 2)

Now type this in:
Code:
console.log('5' + 3)

Watch the magic as the system freaks and returns 53.

Despite writing 5 as a string in the first example, it gets coerced to a number for the subtraction.
In the second example, the interpreter instead decides that the number 3 becomes a string and concatenates it onto the string '5' to create '53'.

Side note: if you did that in C, '5' - 3 would give you the character '2' anyway, because the ASCII code for '5' is 0x35 and the code for '2' is 0x32, which is 3 less. Likewise, '5' + 3 gives you '8' for the same reason. But add or subtract enough to walk outside the '0'..'9' range and you get other arbitrary characters in the ASCII table, which causes issues, so relying on it is bad practice.
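
You can see it for yourself with a few printf calls (plain C, and the same thing happens in C++):
Code:
#include <stdio.h>

int main(void) {
    printf("%c\n", '5' - 3);    /* '2'  (0x35 - 3 = 0x32) */
    printf("%c\n", '5' + 3);    /* '8'  (0x35 + 3 = 0x38) */
    printf("%c\n", '5' + 5);    /* ':'  -- walked past '9' in the table */
    printf("%d\n", '5' - '0');  /* 5    -- the usual digit-to-number trick */
    return 0;
}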

So again, JS isn't provable, because the whole system it's built on is too implicit and creates way too much 'undefined behaviour'. Sure, it's not really undefined, because you can figure out why it acts the way it does, but it's inconsistent in that regard. And with the advent of Node.js for building desktop applications with JS, it could possibly replace Python if you had to choose between learning the two, but at the end of the day it falls into the same pitfalls as Python that I outlined above.



Assembly is only relevant if you're doing reverse engineering, embedded development, or malware development/analysis. It holds no other purpose in the modern day. And straight x86 is outdated; everyone's on 64-bit machines nowadays. Sure, x64 is based on x86, but it has enough differences to call itself a whole different architecture. Compare the original 8088 opcode listings and manuals with everything added afterwards and the dawn of amd64/Intel 64 and you'll see the difference. Even now, in the Intel 64 manuals, the majority of the material only applies to Intel 64 and not IA-32.

The most notable incompatibility is the use of int 80h in x86 versus the syscall instruction in x64, and although you might think there's some degree of backwards compatibility, that's usually only at the OS level, since syscall assembles to the bytes 0F 05, while int 80h is CD 80 (80h being the actual interrupt number, which tells the OS what the interrupt is for). The OS might understand what's happening for compatibility purposes, but at its rawest level (at least in the world of embedded development) it's really not the same, and an x86 binary will not run truly native on an x64 processor without OS abstraction, like running FreeRTOS as a compatibility layer on an STM32 chip (my microcontroller of choice).



So to go back to the original question: one systems language and one scripting language is usually the way to go. The systems language is more for learning, unless you plan on advancing to malware development or OS manipulation, but a scripting language should be known just so you can quickly write something up to prove you can do it before going the whole nine yards and building something more robust natively.