
Wanna dabble in basic programming

VB,C++,assembly
BASIC isn't used anymore.
I think he means he wants to do some basic (simple) programming, not necessarily using a form of BASIC.
VB is a terrible idea; C# is the most worthwhile CLR language to learn, as its grammar and syntax are almost identical to most languages that end statements with semicolons.

For learning how to program, a scripting language like Python (as others have suggested) or Ruby might be a better starting point. Languages like Java, C#, and C/C++ require a deeper understanding of how the computer works, since you're getting closer to the hardware. Ruby and Python will get you into good habits without making everything too difficult. PHP could theoretically be lumped in with Python and Ruby, but generally speaking it's a bad language to start with, since it lets you do some very stupid things.
 
You're correct if the thread starter:
- wants to learn basic programming as a starting point for going deeper or broader later
- wants to pick up good programming habits
- has a lot of time to spare

If the goal is just one or two simple projects, not a full-on profession, I think it's better to learn those 'stupid' languages (VB or PHP) instead, because of:
- the shorter learning curve (they're about the easiest)
- their more forgiving nature - 'stupid' from another point of view, but he isn't doing extremely complex programming, so the debugging effort will be minimal
- being productive in a short time, which may encourage him to do more
 
Yes, but long term it could be demotivating, because PHP doesn't enforce good programming practices. As a new dev, you might write a bunch of code but not know what's going on, because PHP lets you do some pretty funky things. I can only recommend PHP if the web is where you plan on dev'ing, and even then, PHP isn't an end point. I have a special place in my heart for PHP. I love it because you can pump stuff out quickly; I hate it because it can easily become a mess to manage.

I think VB is a better option than PHP because it's strongly typed and requires explicit type casts, unlike PHP, which is loosely typed. So in a CLR language, 1 + "1" would throw an error, whereas in PHP, 1 + "1" would = 2 and is legitimate. Things like that make you consider the argument a little more, where in PHP you might not even stop to think whether something is a string, an integer, or a float.

Now, if someone is interested in the web, I think Ruby is one of the best languages to start with. Pair it with something like Sinatra, and you have a very simple web server that is very easy to understand and very easy to start with.

http://www.sinatrarb.com/

Edit: I agree with all your first points though. I do think anyone first starting is going to have a lot to learn, so time is a necessity.
 
So in a CLR language, 1 + "1" would throw an error, whereas in PHP, 1 + "1" would = 2 and is legitimate.
You're wrong there:
Code:
Option Strict Off ' the VB default, which allows implicit conversions
Imports System

Module Module1
    Sub Main()
        Console.WriteLine(1 + "1") ' 2
        Console.ReadKey()
    End Sub
End Module
.NET automatically tries to convert "1" to Double and it succeeds. Now, if it were "a1", making that conversion invalid, it would fail at runtime.
 

I stand corrected. It's strictly typed, but it does type casts for you. Was that still the case 6 years ago? It's been a while since I've really written any C# or VB.

Doing something like this in Ruby or Python will actually raise a type error, because an explicit cast is required by the language. PHP is even weirder because of its implicit type juggling between distinct types, and JS is similar to PHP in that respect as well.
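For anyone following along, here's a quick Python sketch of that strict-typing behavior (my own example, not from the thread):

```python
# Python refuses to mix int and str with +; you must cast explicitly.
try:
    result = 1 + "1"
except TypeError as e:
    print("TypeError:", e)

print(1 + int("1"))   # prints 2  (cast the string to an int, then add)
print(str(1) + "1")   # prints 11 (cast the int to a string, then concatenate)
```

The cast has to be spelled out either way, so you're forced to decide whether you meant addition or concatenation.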
 
.NET automatically tries to convert "1" to Double and it succeeds. Now, if it was "a1" making that conversion invalid, it would fail at runtime.

Why would anyone find that useful? "1" should never equal 1.
 
Oooonnneeeee!!! Is the loneliest number....
 
I stand corrected. It's strictly typed, but it does type casts for you. Was that still the case 6 years ago? It's been a while since I've really written any C# or VB.
In C#, + is concatenation when a string is involved and addition when only numeric types are involved, so...
Code:
using System;

namespace ConsoleApplication1
{
    class Program
    {
        static void Main(string[] args)
        {
            Console.WriteLine(1 + "1"); // 11
            Console.ReadKey();
        }
    }
}
There is no way to make 1 + "1" = 2 in C# without explicitly converting "1" to a numeric type.

VB 1 & "1" == C# 1 + "1"
VB 1 + "1" == C# 1 + Convert.ToInt32("1")

One could create a C# overload so that integer + string attempts addition like VB does, but there's not much point in doing that.
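To make those two equivalences concrete, here's the same pair of conversions sketched in Python (my addition, just for illustration):

```python
# VB's 1 & "1" == C#'s 1 + "1": convert the number to a string, then concatenate.
print(str(1) + "1")   # prints 11

# VB's 1 + "1" == C#'s 1 + Convert.ToInt32("1"): convert the string to a number, then add.
print(1 + int("1"))   # prints 2
```

Either way, the programmer picks the conversion explicitly instead of the language guessing.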
 
In C#, 1 + "1" would be "11": an integer and a string are not added, they are concatenated. The + sign represents concatenation, and the integer is converted to a string.
 
Some real advice, ignoring what everyone else has said: find something you are actually interested in working on, and choose the language whose features best fit the project you want to build. Your goal shouldn't be to learn programming; your goal should be to complete a programming project that you would find enjoyable to work on. Everything else will come in due time.
 