First, let me say: please don't dismiss my claims before I can provide something to support them (whether I can uphold them will be evident then). On the other hand, you have many strong points and I will do my best to answer them. I do understand your motive to be realistic, so it's my job to either do what I've claimed or accept failure -- let's leave it at that and let the result stand for itself when that time comes.
There is no difference between C structs and C++ classes in terms of how data is stored, but there are major differences in behavior. For example, when a class object is declared in C++ -- "object o;" (assuming "object" is some class) -- a constructor is called, and (depending on how complex the class is) this simple declaration can generate all kinds of instances and copies just to get things set up. In C, such a declaration is JUST a declaration. That is one thing I meant by hidden & automatic things. Similar things happen when a local variable is declared in a function; just try keeping a pointer to a temporary object that automatically destructs when the function closes -- you get a mess. Like C, my language will only construct or destruct objects when told to explicitly (to maintain "total control").

Now, if I were reading this without knowing anything about the project, I would say, "Dude, if you have to do all that manually, how is it cleaner or simpler than C++?" In all honesty, yes, if you are trying to do something complex the code will be more complex; but by "cleaner" I meant does-what-you-say, as opposed to the slew of errors that can come from declaring the wrong thing in the wrong place at the wrong time in C++; and by "simpler" I mean that simple statements like "I just need this object and to do that with it" do just that and nothing more complicated than stated (basically the same thing as "cleaner"). I guess we are talking "simpler" and "cleaner" in the sense that assembly would be. To that I would respond, "But dude, trying to do complex OOP stuff manually is NOT simpler or cleaner" -- well, think of it this way: some simple OOP program in C++ could easily involve all the kinds of things I just talked about (an unpredictable number of copies and references made, etc.), but basically the same kind of program can be made in assembly WITHOUT all that complex OOP; probably gone about slightly differently, but often the clusters of data are roughly the same. I intend to still use OOP just to make handling such clusters look nicer and give it the OOP feel. I feel this paragraph alone could start an endless debate... but hopefully you see my point, and I do recognize the drawbacks (as you can see).
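To make that concrete, here is a rough C++ sketch of the hidden work I'm talking about. The "Object" class and the functions are made up purely for illustration; the point is the constructor/copy calls a bare declaration can trigger, and the dangling-pointer mess from keeping a pointer to something that destructs when its function returns.

```cpp
#include <cstdio>

// Hypothetical class, used only to make the hidden work visible.
struct Object {
    int data;
    Object()              { std::puts("default constructor"); }
    Object(const Object&) { std::puts("copy constructor"); }
    ~Object()             { std::puts("destructor"); }
};

Object makeObject() {
    Object local;        // constructed here...
    return local;        // ...possibly copied (or elided) on the way out
}

int* danglingPointer() {
    int temp = 42;
    return &temp;        // points at a local that dies when the function returns
}

int main() {
    Object o;                     // "just a declaration" in C terms, but a constructor runs
    Object copy = makeObject();   // may trigger extra construction/copying behind the scenes
    int* bad = danglingPointer(); // dereferencing this is undefined behavior -- the "mess"
    (void)copy; (void)bad;
    return 0;
}
```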
As for "most of the features of OOP without the kind of overhead that makes it incomparable to assembly," that DOES need some clarification: indeed, any OOP that goes to the extent of allowing polymorphism is definitely going to be less efficient than straight-up assembly. By "comparable" I mean that, if you were to do the equivalent thing in assembly (i.e. instead of using dynamically inherited classes, using function pointers directly, where a code segment is written to work with whichever "behavior" is given to it), it actually is not that far off! In order to fulfill such a bold claim, restrictions do have to be made: first off, ONLY single inheritance is allowed. I have devised a way to use the Java paradigm of interfaces as well, but that DOES require a bit more overhead. If one were to write code that required a reference to some function and some object to pass to it, though, it would be very comparable. In instances where more manual techniques are more efficient, that is roughly the same as saying it's wasteful to bother using interfaces etc. for the given task -- which is true in general anyway. So "comparable" in that aspect (needlessly complicating one's own program is another thing), but you are definitely right that polymorphism will never be AS efficient as going about a task another way (where that is an option), and nothing keeps the programmer from going about it another way.
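For anyone wondering what I mean by "doing the equivalent thing with function pointers," here is a sketch in C-style C++ (the names are mine, not part of any language): a routine written to work with whichever "behavior" its object carries, which is roughly what single-inheritance virtual dispatch boils down to anyway.

```cpp
#include <cstdio>

// C-style "polymorphism": the code segment takes whichever behavior it is given.
typedef void (*Behavior)(void* self);

struct Sprite {
    Behavior draw;   // one function pointer plays the role of a vtable slot
    int x, y;
};

void drawPlayer(void* self) {
    Sprite* s = static_cast<Sprite*>(self);
    std::printf("player at (%d,%d)\n", s->x, s->y);
}

void drawEnemy(void* self) {
    Sprite* s = static_cast<Sprite*>(self);
    std::printf("enemy at (%d,%d)\n", s->x, s->y);
}

// Generic routine: works with whichever behavior the object carries.
void render(Sprite* s) { s->draw(s); }

int main() {
    Sprite player = { drawPlayer, 1, 2 };
    Sprite enemy  = { drawEnemy,  8, 3 };
    render(&player);
    render(&enemy);
    return 0;
}
```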
"Nothing built into the language with preconceived ideas..." -- Yes that's correct in many cases; but there are many languages where that does not apply (TI-BASIC, Axe Parser, and BBC Basic all are either geared to work directly with the TIOS and no other way, or come with a who set of built-in tools to "give ya everything you should need" for it to be extensively usable). As of yet, No other language for the z80 TI-Calcs (aside from assembly) compares to all-out computer languages like C or C++; I am comparing to what's already out there for them as alternatives to assembly or BASIC (specifics already named). However, don't the "new" and "delete" commands (and other similar mechanisms) of C++ assume there is some sort of OS to handle memory allocation and that it does it a certain way? C uses malloc, which is an explicity coded function with an explicitly decided way to manage memory; and though it's a language standard, it's not "built in" (i.e. a keyword), and alternatives could be coded. That is the kind of environment I mean to provide.
"No limitations on what can be integrated..." - Yeah, the mostly just means because you can inline assembly code. But to be more specific, I'm designing it such that access to system addresses and routines (on the assembly level; i.e. made in any language) can be tied to variables and functions of my language either directly or with very minimal modifications. Some instances will of course require explicit assembly commands to land things in specific registers, since there are many paradigms of how to "pass values" and there is no one "correct" way that will always work for anything and everything coded in any environment.
"Able to do anything in assembly..." -- I suppose that's exactly what I said above, just worded differently. I admit I was pretty tired and just throwing out whatever I could to describe it, hence the repeat
As far as mixing compiled and interpreted variables goes, that is a HUGE topic of debate for the language. Heck yes, I've had one heck of a time coming up with a good syntax and conventions, and I admit I don't have every last kink ironed out just yet... but yeah, I understand why that sounds like more trouble than it's worth. Quite honestly, most of the time it would not be needed, and many of the things I originally thought it would improve can be improved anyway by using data-flow analysis to coalesce variables and things like that. Mostly it is a convenience feature, to be used as an alternative to explicitly hard-coding, say, 100 values that follow a predictable pattern into an array. But the main idea was to create "interpreted functions" that basically act as macros, but can act intelligently on the data given to them. As I said, it's mostly just a nice feature and an added way to hide nasty assembly statements without having to actually call a function at runtime. Admittedly though, overuse of this feature can create messy and confusing code.
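A rough analogy for the "hard-code 100 values" case, using C++14 constexpr as a stand-in for my interpreted functions (the names here are just examples): the table is computed while compiling and baked into the output, so nothing runs at runtime and nobody types out 100 literals by hand.

```cpp
#include <cstdint>
#include <cstdio>

// Analogy only (requires C++14): constexpr does at compile time roughly what an
// "interpreted function" would do in my language -- compute a predictable table
// so the values end up in the binary instead of hand-typed or built at runtime.
struct Table { std::uint16_t v[100]; };

constexpr Table makeSquares() {
    Table t{};
    for (int i = 0; i < 100; ++i)
        t.v[i] = static_cast<std::uint16_t>(i * i);
    return t;
}

constexpr Table squares = makeSquares();   // evaluated by the compiler, not at runtime

int main() {
    std::printf("%u\n", static_cast<unsigned>(squares.v[12]));  // prints 144
    return 0;
}
```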
"The output code will be more suited to the z80 than C or C++" -- Yeah that's a bold claim. I cannot do better than either as far as larger computers are concerned, but on the z80 for a TI-calc ... I retract that C cannot do well on it, I'm sure it could (with a ton of tweaking and all kinds of tools made for it), but my biggest win there is essentially restrictions on the primitive types being only 8 or 16 bits, and the way values are passed (without assuming that an OS provides a heap and using stack frames ... bleh!). I know C can pass directly to registers if needed and perhaps NOT use a stack frame to save values in recursive instances, but that requires the programming to know all about that and do it explicitly -- I intend to have such optimizations be a given. I am comparing what a "typical" programmer of a language would do (pass values the normal way using stack frames and wastefully use the "int" type...)
Not all, but much of what I've discussed is in my blog (that's what it is: a place for me to work out all the behind-the-code issues). I am still working on the topic about OOP, and the topic on interpreted entities... yes, that's a mess, I will not deny it, and it currently takes up 3 posts; but that includes a full history of the kinds of ideas I had for it, how it has changed, and other particulars I've worked out. If you care to read that but only care about what's "current", skip the "An okay solution" section and go down to "A better solution" in the intercompretilation topic.
Instead of stating that "such a thing will never work" (not sure you are even exactly doing that, but that is the picture being painted to readers of your post), is there anyone who is interested in discussing how TO make something like this work? I really feel I have some strong arguments and examples already in place and that I am very close already (see blog). I don't think you are trying to be mean and knock my proposal; you have valid points. I am just trying to explain why (and where/how) I do as well. Thanks.
blog:
http://dancookplusplus.blogspot.com
My apologies; for some reason I only saw Kllrnohj's post when I replied.
@Kllrnohj: I hope you don't think I'm trying to argue; I really do appreciate your questions and comments, especially since you seem to know a lot about this kind of thing. I did kinda laugh to myself because, if I recall, you were a strong critic of my past project (Antidisassemblage) -- and I DID start that before I knew what I was doing and released it before it was close to working properly.
@KermMartian: Well said; that is exactly the viewpoint I expect, and I hope to deliver well to any skepticism or optimism (both are great to have).
@quigibo: Hey, thanks! I already told you how impressed I was with Axe and the beauty of its minimalistic nature (using the BASIC tokens as source code), so it's great to have someone with a similar experience/project give good feedback. There still are plenty of kinks to hammer out before I get anything released. Honestly, between work and school and poring over my own theory, I do not expect anything close to a finished product to be released any time soon. The language (as defined by the compiler) will be utterly minimalistic, meaning that not so much as a "print" function will be part of the language. Of course, that is where libraries and standard files come in. That will be another project all its own (or phase 2 of the current one, I suppose). I want this to be as adaptable to the z80 as C is to [other systems], but there DEFINITELY will need to be libraries/includes made for it: basic I/O, sprites/graphics, data manipulation... Yeah, anything can be made for it, but as far as what will become "standard", I will cross that bridge when I come to it, and hopefully not alone. It is possible that multiple flavors of "standard" will develop (analogous to Linux -- not that this is anything CLOSE to something like Linux). I also foresee libraries being made for tools that already exist. There could be libraries just for Mirage, or just for Doors, or just for Bob's-Amazing-thingamajig-toolkit. In other words, with the right tools, this can be used to code any subset of z80 programs normally coded in assembly.
...Oh, and KermMartian, if you could make a counterpoint, that would be amazing. Ever since I've had this project (or the past one), the only kinds of feedback I have ever gotten are "That sounds great and amazing, but it's over my head and I cannot talk theory with you" or "That will never work because [your claims are bogus] / [I have solid reasoning against it (and am not open to the possibility of it being fixed/improved to an acceptable level to meet your claims/vision)]". The only people apt to grasp all my theory in the past have been uninterested, dismissive, or in disbelief. I've not been able to establish much credibility in the community yet, so I am really hoping I can find that here (and I recognize that may not come until I have something solid to deliver). Thanks.