[Info-vax] The (now lost) future of Alpha.

Arne Vajhøj arne at vajhoej.dk
Tue Aug 7 19:24:41 EDT 2018


On 8/7/2018 3:25 PM, Johnny Billquist wrote:
> On 2018-08-07 03:01, Arne Vajhøj wrote:
>> On 8/6/2018 5:53 AM, Johnny Billquist wrote:
>>> On 2018-08-06 01:49, Arne Vajhøj wrote:
>>>> On 8/5/2018 7:28 PM, Johnny Billquist wrote:
>>>>> On 2018-08-04 00:28, Arne Vajhøj wrote:
>>>>>> C, C++ or Ada still provide easy HW access and
>>>>>> good real-time characteristics, so they will
>>>>>> not go away.
>>>>>
>>>>> I would strongly disagree with this one. With C++, you have no idea
>>>>> what happens when you do something in the language, making it
>>>>> horrible for figuring out any sort of real-time characteristics.
>>>>>
>>>>> Think classes, inheritance (even multiple inheritance), exceptions
>>>>> and so on. You create an object and you have no idea how much code
>>>>> is executed, or how much memory is required. C++ in general depends
>>>>> very much on lots of memory and on a very dynamic memory management
>>>>> model, which is horrible when we are talking about embedded and
>>>>> real-time systems.
>>>>>
>>>>> (But yes, I know C++ is being used by some people for those exact
>>>>> environments anyway. I can just feel sorry for them when, for some
>>>>> surprising and unknown reason, their devices do not work as expected.)
>>>>
>>>> Compared to languages that use GC, C++ is pretty good in this
>>>> regard.
>>>>
>>>> :-)
>>>
>>> I would disagree. I'd say that is a dangerous thing to think or say,
>>> and it sort of legitimizes the use of a bad language that has so many
>>> uncontrolled things going on. Just because it doesn't use GC isn't
>>> enough to make it predictable or good in this field. It isn't
>>> necessarily any better just because it lacks one thing.
>>
>> Some innocent-looking C++ code may end up executing a lot of code.
>>
>> But that should be in the 10 ns to 10 µs range.
> 
> You think/hope. You have in fact no idea what might happen or be done,
> or how it might change because some other code changes, where it is
> not even obvious that it affects things.

That is assuming that the developers have not done something totally crazy.
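
For illustration, a minimal sketch of the kind of innocent-looking
C++ being discussed; the function and the loop are made up for the
example, not taken from anyone's code:

    #include <string>
    #include <vector>

    std::string full_name(const std::string& first,
                          const std::string& last)
    {
        // One innocent-looking line: it may perform several heap
        // allocations, copies every character, and can throw
        // std::bad_alloc.
        return first + " " + last;
    }

    int main()
    {
        std::vector<std::string> names;
        for (int i = 0; i < 1000; i++)
        {
            // push_back may reallocate the whole vector and copy or
            // move every element; neither when nor how long that
            // takes is visible at this call site.
            names.push_back(full_name("John", "Doe"));
        }
        return 0;
    }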

But if that assumption is not valid, then bad things could happen
even in C (serious misuse of #define, for example).
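
As a hedged sketch of the kind of #define misuse meant here (the
macro and the helper function are invented for the illustration;
the code is valid as both C and C++):

    #include <stdio.h>

    /* Looks like a function call, but evaluates its argument twice. */
    #define SQUARE(x) ((x) * (x))

    static int next_value(void)
    {
        static int n = 0;
        return ++n;   /* side effect: a new value on every call */
    }

    int main(void)
    {
        /* Reads as "square of the next value", but expands to
           ((next_value()) * (next_value())): two calls, so it
           prints 1 * 2 = 2 rather than 1 * 1 = 1. */
        printf("%d\n", SQUARE(next_value()));
        return 0;
    }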

Arne
