scripts



Well, "scripting languages" can be compiled.
And "compiled languages" can have interpreters written for them.

Destroyed...

The distinction comes more from the purpose of the language. C was designed as a higher-level alternative to ASM, and to be far more structured and featured than, say, BASIC. It's compiled, but a lot of C syntax/semantics can be used by the interpreter known as csh (c-shell -- yes, they did that purposefully -- and just like bash, sh and zsh, it's just another shell). One could write a full-on C interpreter (thus making C a scripting language). By the same token, someone wrote a compiler for Python (thus making Python a compiled language). Yet C is still called the compiled language and Python the scripting language... So is it subjective? No, it comes down to the thing's primary/intended use.

Now "Programming Language"...

Bottom line is any "language" used to send instructions to a PC (or logic gate "thing") that includes logic-based execution is a programming language. By merit of the fact that what you are doing is... ... ... programming. Eeyup.

The differentiation is a semantic one. Programming is essentially using logic. Now, I don't know the full history, but "scripting" probably originated from listing "stuff to do" without any logic/conditions. Then it likely got corrupted, translated to Klingon, translated back to English, and ended up confusing everyone.

TL;DR: it is a programming language if it is a language -- no matter how abstract -- which uses condition/logic based execution. And the type of "scripting" we're talking about by modern standards using stuff like Python, Bash, Basic, JS, etc. is technically programming.

(And they all lived happily ever after)

- J
 
Fanboi wrote:
Everything is just that, a flow of logic. Even our spoken languages. In fact, programming existed before computers: in an abstract by Ada Lovelace, in mathematics, and even in our own biology... and psychology. Everything is a set of logical instructions, and how we interact with them is a mere interface (oh dear, time for the "causality" debate).
Everything? Colourless green ideas sleep furiously. And dreams ... the wonderful world of sleep :)
Here's something to think about too: https://theconversation.com/why-we-...ains-and-why-the-metaphor-is-all-wrong-185705. If only logic prevailed and our brains were computers ...
 
Scripting languages are programming languages...but programming languages are not necessarily scripting languages.
Hmmm ... not sure on this one, but then I'm not an expert, so you're probably right. I wonder if @JasKinasis could offer some clarifying.

I think everybody else has this pretty much covered already. This is quite a sprawling topic, but I’ll try to be brief.

It’s true, all scripting languages are programming languages. But not all programming languages are scripting languages.

You have compiled programming languages like C, C++, Fortran, COBOL, Pascal, Rust, Go - which compile to the platform's native binary/executable format.
E.g. ELF on Linux, .exe on Windows.
These are programming languages, but are NOT scripting languages.

Then you have compiled languages like Java (and other JVM languages like Scala) that compile to byte-code that runs on the Java Runtime Environment, or in Windows you have C# and VB which compile to byte-code that runs in the .NET runtime. There are a few other languages that compile to an intermediate format that runs on a separate run-time environment.
Again, not scripting languages.

And then there are languages like Lisp and Haskell that can be run interactively in an interpreter, or compiled to a native binary format and run as a native executable.

Compiled programming languages are not scripting languages. In all cases with compiled languages, you need to successfully build/compile the program before you can run it. If there are any syntax errors in the code, the build will fail at either the compilation stage or the linking stage and will report any errors/warnings. You will then have to fix any problems in the code and rebuild/recompile the program until it builds successfully.

You can only run the program once it successfully builds. And even then, you need to test, debug and profile the code extensively, in case there are any other bugs/problems in the code that manifest themselves at runtime.

So in C/C++, you need to ensure there are no memory leaks or buffer overruns, because memory management is left to the programmer! And in all languages, you need to ensure that the logic is completely sound and can gracefully deal with any unexpected, or exceptional, circumstances. And ensure that all user input is properly parsed, sanitised and validated/range-checked before being used.
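As a toy illustration of that last point, here's a minimal sketch in Python of validating and range-checking a piece of user input before using it (the parse_age function and its limits are invented for the example):

```python
def parse_age(raw: str) -> int:
    """Validate and range-check a user-supplied age string.

    Raises ValueError rather than blindly trusting the input.
    """
    raw = raw.strip()
    if not raw.isdigit():           # rejects "", "-5", "12.3", "abc", "1e9"
        raise ValueError(f"not a whole number: {raw!r}")
    age = int(raw)
    if not 0 < age < 130:           # range-check against plausible values
        raise ValueError(f"age out of range: {age}")
    return age

print(parse_age("  42 "))   # → 42
try:
    parse_age("1e9")
except ValueError as e:
    print("rejected:", e)
```

The same shape applies in any language: parse first, check the range, and fail loudly on anything suspicious instead of letting bad data flow onward.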

User input should never be trusted because users are usually either:
1. Really stupid
Or
2. Really clever and/or creative/destructive/evil.

The last thing you want is a user entering a malicious SQL statement in a text field and ending up with your production database being corrupted/destroyed.

So anything input by the user should always be untrusted and thoroughly checked/validated!
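To make the SQL-injection point concrete, here's a small sketch using Python's built-in sqlite3 module (the table and data are made up). The fix is to bind user input as a parameter instead of splicing it into the SQL string:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE users (name TEXT, role TEXT)")
conn.execute("INSERT INTO users VALUES ('alice', 'admin')")

evil = "alice' OR '1'='1"  # classic injection payload

# BAD: string formatting splices the payload into the SQL itself,
# turning the WHERE clause into a condition that is always true.
rows = conn.execute(
    f"SELECT role FROM users WHERE name = '{evil}'"
).fetchall()
print(rows)  # leaks every row: [('admin',)]

# GOOD: the ? placeholder binds the input as data, never as SQL.
rows = conn.execute(
    "SELECT role FROM users WHERE name = ?", (evil,)
).fetchall()
print(rows)  # no such user: []
```

Every serious database driver offers parameter binding like this; it's the first line of defence against the scenario above.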

Most of the time, with compiled programming languages, the most severe problems are caught during the build process, either at compilation, or at the linking step.

Runtime bugs are usually due to faulty logic, or bad practices from the programmer/s. The number and severity of runtime bugs will depend on the skill/ability levels of the programmer/s working on the software. Programmers are not infallible, they are human and make mistakes, just like everybody else.
So bugs are inevitable in software.

Then you have scripting languages like Bash, Perl, Lua, Python, Ruby, JavaScript, PHP etc. Many of these scripts are run directly in an interpreter; others (like Python) are compiled to byte-code (behind the scenes) before being run through the interpreter.
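You can actually see that behind-the-scenes byte-code step in Python using the standard-library dis module, which disassembles what the interpreter really executes (the add function here is just a throwaway example):

```python
import dis

def add(a, b):
    return a + b

# CPython compiles the function body to byte-code before the
# interpreter ever executes it; dis prints that byte-code.
dis.dis(add)
```

The exact opcodes printed vary between Python versions, but the point stands: the "interpreter" is really a byte-code virtual machine, much like a miniature JVM.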

Unless you have a syntax-checking plugin for your chosen text editor, or IDE (Integrated Development Environment) - syntax errors in scripts are not usually caught until you actually try running the script through the interpreter.

So with scripts, you usually write your script and keep running it, fixing any syntax errors along the way. Once you've got your script running without any syntax errors, you need to thoroughly test and debug it, to fix any runtime bugs that might manifest themselves - which, again, are usually down to poor/faulty logic on the programmer's part.
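A quick way to see this in Python: the built-in compile() performs the same syntax check the interpreter does when it loads a script, so you can catch a syntax error without executing anything (the snippet and its broken loop are invented for illustration):

```python
source = "for i in range(3)\n    print(i)\n"   # missing the colon

try:
    compile(source, "<myscript>", "exec")
except SyntaxError as err:
    print(f"line {err.lineno}: {err.msg}")
```

This is essentially what syntax-checking editor plugins and linters do for you continuously as you type.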

So again, you need to sanitise, validate and range-check all user input and check that all of your logic is completely bullet-proof/idiot-proof/evil genius-proof!

Many programming and scripting languages allow you to write test code that you can use to test the integrity of different modules in your code. So you can perform test-driven development, which is another way of detecting when errors have been introduced into your codebase.
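As a small sketch of that idea, here's what a unit test looks like with Python's standard unittest module (the slugify function is a made-up example of a module under test):

```python
import unittest

def slugify(title: str) -> str:
    """Turn a page title into a URL slug, e.g. 'Hello World' -> 'hello-world'."""
    return "-".join(title.lower().split())

class TestSlugify(unittest.TestCase):
    def test_basic(self):
        self.assertEqual(slugify("Hello World"), "hello-world")

    def test_collapses_whitespace(self):
        self.assertEqual(slugify("  Foo   Bar  "), "foo-bar")

if __name__ == "__main__":
    unittest.main()
```

If a later change breaks slugify, the test run fails immediately - that's the early-warning system test-driven development gives you.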

Also, HTML is NOT a programming language, it’s a mark-up language, used to describe the contents of a web page.
CSS is kind of an extension to HTML - allowing you to set styles for page elements. So again, not really a programming language per se.

But scripts written in languages like JavaScript can provide client-side functionality in web pages, making the user experience more interactive/immersive, while PHP, Python and Ruby (via frameworks like Django/Flask and Ruby on Rails) etc. handle the server side - allowing users to log into sites, interact with server-side services/micro-services, load/save personal preferences etc.

In answer to the OP's question - programs are written in various different programming languages, depending on the needs of the project and the expertise of the person/team behind each piece of software.

Editing any program requires modification to some kind of source code - which could be in any of a number of programming languages.

Some software is compiled to a native binary format, or to byte-code that runs in a runtime environment. Others are just collections of scripts.

When it comes to choosing a programming language for a project, there are a number of different things to consider. From the languages that are best suited to the type of application you’re developing, or the platforms you will be targeting. To the technical requirements of the project. To the programming languages that you, or your team are most familiar/proficient with.

Some developers use one or two languages, all of the time, for all projects/products. Others are more pragmatic and will choose different languages, based on the aims/goals/technical specifications/requirements of each project.

If speed/performance is key, then C or assembly are your best bet, because they are the lowest-level languages. They interact with the hardware more directly and have less overhead.

But working at such a low level requires careful coding because there is much less type safety and the programmer is left to deal with all memory management. And again sanitising all user input goes without saying.

For more type-safety and more automated memory management - you can use higher level compiled languages. Higher level compiled languages tend to perform a tiny bit slower, because of the extra overhead used to instantiate the libraries of convenient abstractions they provide for the programmer. And the more abstract the language is, typically the more overhead is required.

And any interpreted scripting languages (Python, Ruby etc), or compiled languages that run on a runtime (like Java, C#, VB) tend to incur a bit more overhead. Again, because they require extra resources to instantiate the interpreter/runtime and load any additional scripts/frameworks/resources required by the script/program, before it can start running the actual script/program itself.

But with the processing speed and power of modern PCs, these kinds of factors are less of an issue for typical desktop applications. So for many applications, the choice of programming language isn't that critical.
 
...but I’ll try to be brief.
And then, I kept reading... and reading... and reading. ;)

Jason, you are a treasure, and I'm just poking a little fun. That was an excellent reply, and I thoroughly enjoyed every bit of it. I'm not on here very much anymore, but this was a good moment to jump in and say "Thanks!" to you for all the help you've given me (and others) here. You are very much appreciated! :cool:

When you find the time to write "the long version"... I'm quite sure O'Reilly would publish it, and I would be happy to purchase it.

Cheers
 
I still think HTML is a programming language since it's used for programming web pages...but Semantics games only get us so far...
 
Fanboi wrote:

Everything? Colourless green ideas sleep furiously. And dreams ... the wonderful world of sleep :)
Here's something to think about too: https://theconversation.com/why-we-...ains-and-why-the-metaphor-is-all-wrong-185705. If only logic prevailed and our brains were computers ...
I think you missed the context of what I said (also, PKD ref noted):
Fanboi said:
Objectively, this is just another interface. All methods of programming are simply a means of interfacing with a computer and they all amount to a simple thing: logic flow. Everything is just that, a flow of logic. Even our spoken languages. In fact programming existed before computers. In an abstract by Ada Loveless, in mathematics, and even in our own biology... and psychology. Everything is a set of logical instructions and how we interact with them is a mere interface (oh dear, time for the "causality" debate).
I meant "everything" relating to programming, initially. I then went on to apply it to the basic construction of our human make-up. We are programmed biologically. That affects us psychologically. There are disagreements within the APA (who are not the final authority anyway) about merging psychopathy and sociopathy (bad idea, can't get into it here, forum rules): psychopathy is rooted in biology, sociopathy much less so (opponents are trying to argue weak fringe cases -- I use the "11 fingers" rebuttal). But biology plays a role in depression, too. In fact, the more we research (where recent western politics doesn't hinder things), the more we're understanding how much of human behaviour is influenced by our endocrine systems, and how genetics affect (directly and indirectly) personality even in the face of environmental indoctrination. That's not to say we're all just logic machines; it's to say we have base programming that affects us (which is important to understand so we can better ourselves and find happiness/contentment/stability). I don't wanna threadjack further, so I'll leave this part of the talk at that, though there's a big meta-discussion I'd have about this if there was a General Chit Chat thread/sub.

I will move now to the note on AI "sentience". Being self-aware requires more than just being cognitive, as cognition can be observed in animals, including rats, certain dogs, elephants, and sea mammals -- to name but a few. To keep it short and to the point, self-awareness requires the ability to critically question one's existence -- realistically, metacognitively; not just to say one criticises one's existence because it makes logical sense as a means to prove one's self-awareness (the ends). Often the phrase cogito ergo sum is haphazardly tossed about as a trend. It sounds flashy/clever, but what is "thinking" or, specifically, "being cognitive"?
Can we really draw the line between passing random thoughts caused as an organic process and our own, "real" ones? Can we draw the line between logically emulating self-awareness as a solution to a problem (the end goal being that the "thing" is real) and the genuine article? We cannot, objectively. So if everything is subjective, that puts us back at design instead of free will. Here is where the more fitting dubito ergo cogito, ergo sum enters. So, yes, the most accurate way to prove self-awareness is through questioning it (doubt), specifically questioning the legitimacy of your questioning of yourself. Now, granted, we're entering the "prove your existence" weeds; remember that the most appropriate response to "How do I know I'm real?" is simply "Who wants to know?" So when this news broke, I had the same reaction as I do to all the FUD around AI: "Meh, at least The Matrix had cool action scenes."

And that said, I think that's enough for this off-topic. I will take ages to reply, but as I often say, "DM (PM) is always open to anyone, I just sometimes take ages to reply" (I'd love to have the time to run a discord server for this stuff).

Back on topic...
 
Also, HTML is NOT a programming language, it’s a mark-up language, used to describe the contents of a web page.
CSS is kind of an extension to HTML - allowing you to set styles for page elements. So again, not really a programming language per se.
Thank you, thank you, thank you, thank you... you get the point. I grind my teeth when adverts to "Learn to Program" come up offering HTML. It's a pedantic irk of mine, lol.
 
Thank you, thank you, thank you, thank you... you get the point. I grind my teeth when adverts to "Learn to Program" come up offering HTML. It's a pedantic irk of mine, lol.
Plain, old-fashioned HTML isn't programming. But a lot of people associate embedded languages like ASP and PHP and JavaScript with the wrapper name HTML. Not a whole lot of professional static HTML websites out there these days.
 
Plain, old-fashioned HTML isn't programming. But a lot of people associate embedded languages like ASP and PHP and JavaScript with the wrapper name HTML. Not a whole lot of professional static HTML websites out there these days.
Fair point. Doesn't make them right, but fair point. Like how people call the monster "Frankenstein" -- another grinding irk of mine, lol. I used to be one of those "it's not 'Linux', it's GNU/Linux" people when talking about GNU-based distros. Professional pedant IOW.
 
Thanks Fanboi for your response in post #28. I appreciate it. I do agree not to waylay the thread. The only brief point I would make is that Descartes' attempt to prove his existence with the famous "cogito ergo sum", which translates as "I think, therefore I am," makes the logical error of assuming, or affirming, the consequent. He assumes there is an "I" (the first "I" in the translation) to try and prove the existence of that very "I" (the second "I" in the translation) - a fatal logical error. In other words, his affirmation of the second "I" is used to prove the first "I" ... very cheeky of him really.

Back on topic, I would point out that applying the term "programming" to brain processes is using a metaphor, which is merely illustrative or figurative or symbolic etc, not decisive from a scientific point of view.

And thanks too JasKinasis for that cracker post #25.
 
