02-18-2015, 03:15 AM   #1
Mysterionymous (Beta Member)
Binary

I've never had any technical training in programming, but is binary an actual programming language? I know network guys use it, but do programmers?

02-18-2015, 10:35 AM   #2
celegorm (Site Team)

Outside of the lower-level classroom activities, binary wasn't used much in any of my networking classes in school. It was just used to show how networks determine where packets should go and what counts as internal traffic vs. external.

The same thing applies to programmers: we don't program anything in binary (except on rare occasions), but everything gets turned into binary by the computer running it.

02-18-2015, 10:37 AM   #3
pete.i (Daemon Poster)

Hello and welcome to the forum. You have asked a very involved and complex question. In short, yes: binary was, and is, used to program computers.

Binary is a numerical system based on two states, 0 and 1. We normally work in denary, a numerical system based on ten digits, i.e. 0 to 9; we use that system because we have ten fingers. Computers use binary because every piece of electronics that enables a computer to run is based on a system of switches, and switches have only two states, on or off. The on state is represented by a 1 and the off state by a 0.

While programming in 1s and 0s can be, and is, done, it can be very time-consuming and takes all of your mental skill to bring about a result, so programming languages were developed, such as C, C++, Fortran, etc. There are lots of different programming languages, and they are often suited to specific programming tasks. They use logical commands rather than a series of 1s and 0s to perform a certain task within a program. But the bottom line is that all programming languages get broken down into the base language the machine uses, which is binary.
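
To make that concrete, here is a tiny C sketch (just an illustration, with an arbitrary example value) that prints a byte as the eight on/off switches described above:

Code:
#include <stdio.h>

/* Print an 8-bit value as eight on/off "switches". */
static void print_bits(unsigned char value)
{
    for (int bit = 7; bit >= 0; bit--)
        putchar(((value >> bit) & 1) ? '1' : '0');
    putchar('\n');
}

int main(void)
{
    print_bits(42);   /* 42 = 32 + 8 + 2, so this prints 00101010 */
    return 0;
}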

I'm afraid that is a very simplistic explanation, and if you want more knowledge and experience in this sort of stuff you really need to study computer science. I am by no means an expert on these things, but I was brought up using personal computers such as the Sinclair ZX80, which had to be programmed in binary code to perform very simple tasks such as drawing a mono-coloured circle on the screen. That bit of binary coding actually took up a large part of a page in the Sinclair magazine that attempted to teach us computer sprogs how to program. I never did learn how to do it and eventually just went down the road of building, using and repairing computers, and letting others do the programming.

02-18-2015, 11:54 AM   #4
strollin (Fully Optimized)

For a direct answer to your question, no, binary is NOT a programming language. As Pete.i said, binary is a number system and the only thing that a computer actually works with.

Languages are what humans understand. The purpose of a programming language is to express instructions in a form humans understand, which then gets translated into binary so a computer can process it.

Today's Dilbert comic strip seems appropriate here:
[attached image: dt150218.gif]

02-18-2015, 12:38 PM   #5
pete.i (Daemon Poster)

Here you go, something to get your teeth into if you are really interested in this stuff. I hope you are, because this is really where the money is to be made in computing. This is just a start.

How Bits and Bytes Work - HowStuffWorks

02-18-2015, 12:47 PM   #6
Mysterionymous (Beta Member)

Wow, thanks everyone for all the informative replies. I'm currently attending a community college (looking to transfer to a university after I graduate) for IT and am generally curious about programming. I was initially intimidated by it because I'm not great with advanced math, but I'm really eager to learn all that I can about programming and syntax and all that good stuff. I just think it's cool.

02-19-2015, 09:54 AM   #7
iPwn (Site Team)

Quote:
Originally Posted by celegorm
Outside of the lower-level classroom activities, binary wasn't used much in any of my networking classes in school.
...Really?

Never got into subnet masking using CIDR notation and figuring out how many hosts vs. networks you have??

Not saying I know more than you... but I remember doing that crap because I hated it!! Then again, with the advent of tools like SolarWinds' subnet calculator... I guess that math gets lost.
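
For anyone curious, the binary arithmetic behind that host/network split is easy to sketch in a few lines of C (the /26 prefix here is just an arbitrary example):

Code:
#include <stdio.h>

int main(void)
{
    unsigned int prefix = 26;                      /* e.g. 192.168.1.0/26 */
    unsigned int host_bits = 32 - prefix;          /* bits left over for hosts */
    unsigned long hosts = (1UL << host_bits) - 2;  /* minus the network and broadcast addresses */
    unsigned int mask = 0xFFFFFFFFu << host_bits;  /* subnet mask built from the prefix */

    printf("/%u -> mask %u.%u.%u.%u, %lu usable hosts\n",
           prefix,
           (mask >> 24) & 0xFFu, (mask >> 16) & 0xFFu,
           (mask >> 8) & 0xFFu, mask & 0xFFu,
           hosts);
    return 0;   /* prints: /26 -> mask 255.255.255.192, 62 usable hosts */
}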

02-19-2015, 12:19 PM   #8
celegorm (Site Team)

Lol that was my lower-level class...

02-20-2015, 12:35 PM   #9
iPwn (Site Team)

...? Really? Network Engineer?

02-20-2015, 02:16 PM   #10
celegorm (Site Team)

No, software engineer with a focus on networking. After the class where we programmed the physical controllers (which needed binary and knowledge of how it's used to calculate subnets and such), we worked on making our own protocols in the upper-level classes using our controllers from the lower-level ones.

02-25-2015, 06:00 AM   #11
root (Site Team)

Might be worth pointing out that, while binary is not a programming language, it's not unheard of to program in binary.

It's unlikely that you ever will, though.

Generally, if you look at programming, it'll be from a point of view like this (when I say "next step down" I'm talking about getting closer to the silicon):

Using a drag-and-drop type program (e.g. Media Builder), everything is basically done for you. It's like using ActionScript or JavaScript: you put an element down and you say, on click, play this sound. You don't need to worry about how files are actually loaded, how networking works, or anything too in-depth.

Down from this you get languages like Visual Basic: again very drag-and-drop, and you don't need to learn how to draw a window on the screen, but you get a bit more power over what things can and can't do. You're also a bit more involved with creating things, so rather than having a file-selection box as a pre-built thing ready to use, you put in the code to make that.

Down from that you start getting to languages like Java. You need to tell the program where everything goes, and there aren't many graphical (drag-and-drop) interfaces; you're writing in code. But even though this sounds like really hard work, the computer and compiler are still doing a ton of work for you: you don't need to worry about much memory management or anything like that, and there are still plenty of "pre-built" things. The trouble is (and this might just be personal experience) that most things written in Java seem to be resource-hungry beasts. I guess it's cheaper to buy more memory than to figure out why your notepad application consumes 50 MB of RAM. C++ gives you (the programmer) a bit more to worry about than Java, but by and large lots of things are done for you.

Then you get things like C, where the onus is very much on you to allocate and de-allocate memory. You have to write much more code, but (generally speaking, in my experience) the results are slicker, e.g. using fewer resources. For all intents and purposes, unless you're working on OS kernels or embedded hardware, you may as well consider C a dead language; there aren't that many programs written in it any more.
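
A tiny sketch of what that onus looks like in practice (nothing clever, just an illustrative allocate/use/free cycle in C):

Code:
#include <stdio.h>
#include <stdlib.h>
#include <string.h>

int main(void)
{
    char *buffer = malloc(64);    /* you ask for the memory yourself...            */
    if (buffer == NULL)
        return 1;                 /* ...and handle the case where you don't get it */

    strcpy(buffer, "hello");
    printf("%s\n", buffer);

    free(buffer);                 /* ...and hand it back, or the program leaks     */
    return 0;
}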

All these languages require compilers to move from a human-readable form to machine code.

After this there is another type of programming language.

Often when people talk about Assembly languages they write them using mnemonics:

mov A,B
(move contents of register A to register B)
(there is a post on this here: http://www.computerforums.org/forums...lp-226255.html)

When written like this there is still the need for a compiler of sorts (an assembler).

In this case, though, the translation is literal:
MOV = 0000 1010 (instruction 10)
A = 0000 0001 (register 1)
B = 0000 0010 (register 2)

So MOV A,B literally gets assembled to 0x0A, 0x01, 0x02.

But (and here is the clever bit) when you read the programming manual for a chip, those instructions (MOV) and register locations are given in hex.

So instead of writing MOV A,B,

you can refer to the manual and just directly write,
0x0A
0x01
0x02

or you could write that out in binary as

00001010
00000001
00000010
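
Sketched out in C (using the illustrative opcode values above, not a real instruction set), you could store those three bytes and dump them straight back out as binary:

Code:
#include <stdio.h>

int main(void)
{
    unsigned char program[] = { 0x0A, 0x01, 0x02 };   /* the hand-assembled MOV A,B from above */

    for (size_t i = 0; i < sizeof program; i++) {
        for (int bit = 7; bit >= 0; bit--)            /* print each byte bit by bit */
            putchar(((program[i] >> bit) & 1) ? '1' : '0');
        printf("  (0x%02X)\n", program[i]);
    }
    return 0;
}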

So, the long and the short of it is: yes, you can program in binary. Not only is that possible, it used to be the ONLY way to program, where program instructions were literally entered with a bank of 8 switches and a GO button: you'd set the switches according to line 1 of the program, then press GO; set the switches to line 2 and press GO; set the switches to line 3 and press GO; and so on.

This is still possible with some very small chips (though most chips load data via a serial bus, so parallel programming is all but dead!).


Not quite binary, but programming a Z80 in machine code (0x0A, 0x01, 0x02 and so on) is, or at least was, on the A-level (16 to 18 year old) electronics syllabus in the UK. I was given a question paper and a programming manual, and you had to hand-write programs in hex under exam conditions.

Writing a program in hex to create a signal generator was part of my first-year electronics degree course...


Even though it's highly unlikely that you'll ever need to program directly in hex or binary (though people still do for small embedded controllers), that doesn't mean it's not worth learning. It's quite useful to be able to count in binary (and hex!), so at least you understand when you need an int versus a smaller integer type, etc. Many programs (especially encryption-based ones) will use binary operations to change data: stuff like bit shifting and performing XOR operations.
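
A toy example of that kind of bit twiddling (arbitrary values, not any real encryption scheme):

Code:
#include <stdio.h>

int main(void)
{
    unsigned char data = 0x5C;                 /* 0101 1100 */
    unsigned char key  = 0xA3;                 /* 1010 0011 */

    unsigned char scrambled = data ^ key;      /* XOR flips exactly the bits set in the key */
    unsigned char restored  = scrambled ^ key; /* XORing with the same key undoes it        */
    unsigned char doubled   = data << 1;       /* shifting left by one doubles the value    */

    printf("data=0x%02X scrambled=0x%02X restored=0x%02X doubled=0x%02X\n",
           data, scrambled, restored, doubled);
    return 0;
}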


So, to answer your question:
Is binary a programming language? No.
Do programmers use it? Yes.

Weirdly, even though binary is not a programming language, you can program in binary (where "program" means physically sitting at a box of switches entering code into a device!). It's not the same as programming in Java or C, etc.



My advice: if you are hazy on the concepts of number systems in computing, try to get as much committed to memory as you can. That way you know the grass-roots stuff and can devote more time to understanding the more advanced material when required.

02-25-2015, 09:59 AM   #12
strollin (Fully Optimized)

Yes, Root, I did some programming of an HP 3000 minicomputer in a college class back in 1976 or so using only the front-panel switches. However, the front-panel switches represented octal, not binary, so we had to convert the instructions into octal in order to set the switches. Once the program was completely entered, you hit the "Run" button. The front-panel lights would flicker for a few seconds and then stop; if the lights then showed a result of 0 (a return code of zero), the program ran successfully, otherwise a non-zero return code meant your program had failed.

After that class, I couldn't understand why people were so awestruck by computers, since I thought it was way too much work to set the switches for each instruction and then run the program only to try to get a return code of zero. It was another eight years before I used a PC with a keyboard for input and a monitor for output, where I could finally see the value of using a computer. In between, I took a programming class where we used punched cards for input, another method that I felt was way too complicated to be worthwhile.

02-25-2015, 11:24 AM   #13
root (Site Team)

That's not really too different from using hex as an intermediary step.

In hex, the digits 0, 1, 2, 3, 4, 5, 6, 7, 8, 9, a, b, c, d, e, f represent the numbers 0 to 15 (sixteen values in all), i.e. a four-bit binary number.
In octal, the digits 0, 1, 2, 3, 4, 5, 6, 7 represent the numbers 0 to 7, i.e. three binary bits.
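
A quick way to see those groupings (an arbitrary example value, nothing more):

Code:
#include <stdio.h>

int main(void)
{
    unsigned int value = 0x0A;        /* 0000 1010 in binary */

    printf("hex:   %02X\n", value);   /* 0A  -> two groups of four bits: 0000 1010     */
    printf("octal: %03o\n", value);   /* 012 -> three groups of three bits: 00 001 010 */
    return 0;
}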

It's still sort of programming in binary, insofar as you can "see" electrical connections being made. And yes, a hell of a journey to see a light blink, or to add two small numbers together that you could have done in your head!

I guess the answer is still the same: you can do it, but ordinary people would choose not to!

HP3000 is a little before my time!!

02-28-2015, 06:51 PM   #14
berry120 (Site Team)

One thing I haven't seen mentioned (which is actually where I've had to drop down to raw 1s and 0s most) is reverse engineering of comms protocols (particularly serial protocols).

I've spent the best part of the last week in my day job reverse engineering an IR protocol used on laser tag guns... it turned out to be a form of RS-232 over IR, but with the start and stop bits backwards (don't get me started on the stupidity of that last bit!). That required an oscilloscope, a logic analyser, and many hours of staring at highs and lows on the scope (and translating them into 1s and 0s accordingly) before we figured out what was going on.

Now that's figured out, we're down to analysing the protocol at a higher, packet-based level, and the same applies: you still realistically have to work with the data at either a binary or a hex level to work out which bits are changing to what based on different parameters.
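
One simple trick for that stage (a rough C sketch with made-up packet bytes, not the actual protocol) is XORing two captures against each other to see exactly which bits changed:

Code:
#include <stdio.h>

int main(void)
{
    /* Two captures of the "same" packet with one parameter changed (made-up bytes). */
    unsigned char capture_a[] = { 0x55, 0x3C, 0x80, 0x01 };
    unsigned char capture_b[] = { 0x55, 0x3C, 0x88, 0x01 };

    for (size_t i = 0; i < sizeof capture_a; i++) {
        unsigned char diff = capture_a[i] ^ capture_b[i];   /* set bits = bits that changed */
        if (diff != 0)
            printf("byte %zu: changed bits 0x%02X\n", i, diff);
    }
    return 0;
}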

Before that I was doing a similar task on a circuit board designed to drive ultrasonic rangefinders - same story.

Before that I was doing the same thing on an atomic clock receiver with a UART (the drivers were Windows 3.1 only, so not very useful today, but the receiver itself works great!). Again, same story with dropping back to raw binary to work out the protocol.