Why Objective-C is bad

This is a complicated question, but in short, I think the answer most likely lies in the age of the operating systems and their roots. Windows is an old operating system, one built by stacking hack upon hack going back all the way to Windows 3.x, with .NET and C# layered on top.

Mac OS X, on the other hand, is a comparatively young operating system, and its new parts (while still quite old, being inherited from NeXT and whatnot) are all based on Objective-C because, "Hey, why not?" Backwards compatibility was not among the list of priorities with Mac OS X. The issue with Obj-C is that the power of the language comes mostly from the sizable frameworks, the generally high level of integration into the system, and so on.

It's almost impossible to get a good jive like that going without a clean break from backwards compatibility and, as such, it would never really stand a chance on any platform that didn't dare to do this. Apple, with a small (at the time) and devoted user base, dared to do it, and struck gold. Microsoft is now trying, but is, in my humble opinion, failing. I was recently standing in a bookshop reading Masterminds of Programming, where the creators of programming languages talk about their creations.

Objective-C is nothing but a thin layer over C (a bit thicker with 2.0). Even the most basic object orientation is provided by the runtime library, which was proprietary for a long time.
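To make the "thin layer" concrete: a message send is, under the hood, just a C function call into that runtime library. A rough sketch (the method name is a made-up example):

    #import <objc/message.h>

    // Source code:
    //     [obj doSomething];
    //
    // What the compiler emits is, roughly, a plain C call into the runtime:
    //     ((void (*)(id, SEL))objc_msgSend)(obj, @selector(doSomething));
    //
    // All of the object-oriented machinery (classes, method lookup, dispatch)
    // lives in that runtime library, not in the language syntax itself.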

Inertia is an important factor in language use. While there is some value for Objective-C outside of GUIs (and I think people would use the extensions were they imported into C), even in system code, there is little reason to choose it over alternatives when little of your system is meant to work with it.

Many people are impressed by lots of features. And it's had more research and development poured into it. Essentially, momentum.

Quote: "Constraint b is unfair, because as I said, as a superset to C, it shouldn't have to resemble SmallTalk - it should have to resemble C". Quote: "Constraint c is unfair too. In C, you can do member access from a struct object as such: 'obj.member'. It makes logical sense to call methods in the same way."

Quote: "It doesn't look weird because it's not what we're used to - it's because the bits that it extends C with don't look anything like C". This isn't just a subjective thing. So since you're so sure, I will show you how it's done: the class declarations are easy - find the 'class' keyword - so everything between the curly braces is something that needs preprocessing.

The important part is method calls. So, let's say we have a dot-syntax call like obj.someMethod(arg). We'll start by tokenizing, obviously, and work through the tokens, adding them to a parse tree.

So far we have this parse tree (the tree itself isn't reproduced here).
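Whatever the intermediate tree looks like, the output such a preprocessor would ultimately have to emit is ordinary message-send syntax. A rough sketch of the intended translation (the class, method and variable names are all made up), plus the one place Objective-C 2.0 really does allow dot syntax, namely properties:

    #import <Foundation/Foundation.h>

    // Hypothetical input the proposed preprocessor would accept:
    //     result = obj.someMethod(x);
    // and what it would have to emit as real Objective-C:
    //     result = [obj someMethod:x];

    // Objective-C 2.0 does offer dot syntax, but only for properties:
    @interface Widget : NSObject
    @property (nonatomic, assign) NSInteger count;
    @end

    @implementation Widget
    @end

    int main(void) {
        Widget *w = [[Widget alloc] init];
        w.count = 5;                      // compiled as [w setCount:5]
        NSLog(@"%ld", (long)w.count);     // w.count compiled as [w count]
        return 0;
    }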

I have no problem with Objective-C being what it is, so long as Objective-C has no problem with my opinion of how ugly it is (which is very ugly), but it's a tad frustrating that Apple has kept it as its native development language. A friendlier alternative would make any Apple platform a lot more approachable to develop native apps for. That being said, I'll be learning it anyway, since Apple's obstinacy has made it worth knowing.

You know what's scarier than Objective-C? XCode. It was scary for the first day or so, but once you get used to it, it's OK.

I agree though, it does have some really messed up problems, in particular when the inspector opens the last build configuration rather than the current build configuration. DavidR, Objective-C has a very thin runtime engine built into the actual object code, and as far as I'm aware, it always has. How could they possibly do that using a preprocessor?
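To illustrate the runtime point with my own example (not the poster's): the method to invoke can be chosen while the program is running, which a purely textual preprocessor could not resolve; it needs the runtime's dispatch machinery.

    #import <Foundation/Foundation.h>

    int main(void) {
        NSString *methodName = @"uppercaseString";   // imagine this came from user input
        NSString *obj = @"hello";

        // The selector is built from a string at run time, so no textual
        // preprocessor could know which method (if any) will be called;
        // the runtime's method tables do the lookup.
        SEL action = NSSelectorFromString(methodName);
        if ([obj respondsToSelector:action]) {
            NSLog(@"%@", [obj performSelector:action]);
        }
        return 0;
    }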

Still, guess it comes down to opinion. I think C mixed with SmallTalk is intrinsically ugly, and it was a bad choice to mix such different languages. Quote: "Objective-C has a very thin runtime engine built into the actual object code, and as far as I'm aware, it always has."

Quote: "Still, guess it comes down to opinion." I have discovered that by changing the file extensions from .m to .mm you can mix in C++. That's what I'm doing too; it's quite a popular option that I wish Apple would make more of a standard.

Personally I really like XCode, but it's not like I have a point of reference - maybe it's easier for me to just accept the way it works; I've never used VS or anything like that to compare it with. It's honest, and avoids all that interface builder stuff.

XCode is getting git support? Do you feel it makes up for the lack of GC? My own, albeit quite limited, experience of the community has been overwhelmingly positive. Sure, it may not be as big as the Java or C# community, but it's helpful nonetheless. There is a lot to improve, and as you mention, bug reports and suchlike interactions with Apple would be very welcome.

I really like Cocoa Touch, and I also think the documentation is really good. As far as Xcode goes, I wrote this under Xcode 3 which double sucked.

Xcode 4 is certainly better, but it still sucks. Dustin L. Objective-C is by no means meant as a nepenthe for memory management woes, and is not billed as such. I am by no means an Xcode, Apple, or Objective-C apologist, but credit should be given where it is due. I have a different take on XCode. Turns out that allocating objects inside the game loop is pretty much a no-no in Java.

Basically the game loop runs many times per second, so anything you allocate will tend to fill up memory fairly quickly. And when the garbage collector runs, it introduces millisecond-scale delays into the game loop, basically causing very noticeable pauses during gameplay. Clive Paterson. I pretty much agree with this article. I have used a lot of IDEs and Xcode is a bit behind the rest.
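The usual workaround for that kind of problem, sketched here in Objective-C to match the rest of the thread (the anecdote above was about Java, and everything in this snippet is a made-up example), is to allocate reusable objects once, outside the loop, instead of once per frame:

    #import <Foundation/Foundation.h>

    // Hypothetical per-frame work that fills a caller-supplied buffer.
    static void updateFrame(NSMutableArray *scratch) {
        [scratch addObject:@"frame data"];
    }

    int main(void) {
        // Allocate the scratch buffer once, outside the loop, and reuse it,
        // instead of allocating fresh objects on every iteration.
        NSMutableArray *scratch = [[NSMutableArray alloc] init];
        for (int frame = 0; frame < 600; frame++) {
            [scratch removeAllObjects];   // reuse, don't reallocate
            updateFrame(scratch);
        }
        return 0;
    }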

The biggest gripe for me is the way... That's a pretty strong statement that you will screw up memory management despite your best efforts. So apparently Objective-C memory management is hard. That's what Apple thinks, anyway. Do they propose an alternative mechanism for handling resources other than reference counting? It sounds like the above statement is possibly being made in anticipation of the introduction of garbage collection, which would make piggybacking resource destruction on dealloc non-deterministic.

Whereas it could also be interpreted as a very strong reason to favor manual (or maybe automatic) reference counting, and eschew GC entirely. I don't know Objective-C very well, but I wonder if the use of GC has generated these arguments against it from within the OS X developer community. That's an interesting hypothesis, but I can't find any source to confirm or contradict it offhand.

The part I took the quotes from only mentions that the order of dealloc'ing objects in a collectable object tree is undefined, as is the thread on which dealloc is called. Both of those are easy to keep in mind while implementing dealloc, though. If a resource has to be freed from a particular thread, then dealloc can schedule it to be released on the right thread using GCD.
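A minimal sketch of that pattern, assuming a hypothetical class that owns a main-thread-only CALayer; dispatch_async and dispatch_get_main_queue are the real GCD calls, everything else is illustrative, and it's written pre-ARC (manual retain/release) to match the era of this discussion:

    #import <QuartzCore/QuartzCore.h>

    @interface LayerOwner : NSObject { CALayer *_layer; }
    @end

    @implementation LayerOwner
    - (void)dealloc {
        // Copy the ivar into a local so the block captures the layer itself,
        // not self (self is being deallocated and must not escape).
        CALayer *layer = _layer;
        dispatch_async(dispatch_get_main_queue(), ^{
            // Runs on the main thread, whichever thread dealloc was called on.
            [layer removeFromSuperlayer];
        });
        [_layer release];
        [super dealloc];
    }
    @end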

The non-deterministic order of dealloc'ing would rarely be a problem for releasing resources. After all, if a resource is only used via a particular object, and that object is dealloc'ed, then clearly it's okay to release that resource! Perhaps there are complicated cases where resources have to be released in a particular order, but that's no reason to give up RAII for simple cases. That looks really cool, but it doesn't work in Xcode 4.

Apparently it's a feature in a later Xcode 4 release. There's no developer preview for Lion, though. Fingers crossed. Best I can tell, all of the problems you cite are fixed by MacRuby. It shows how surprisingly well Ruby's semantics map onto the message-passing semantics of Objective-C. They also found ways to wrap up the C stuff without making you manage your own memory.

Not sure why Apple hasn't been more aggressive in pushing it for Cocoa development. Maybe because they don't trust it to perform well, yet, on iOS devices and don't want to promote it until it can be used anywhere as a replacement for Objective-C. This is not surprising given that both Ruby and Objective-C xeroxed their object models from Smalltalk.

What in the parent post do you disagree with? It's probably obvious to you, but it's not obvious to me. I understood his point to be mostly that syntax melts away after time, and you will just see the concepts. It seems that you are objecting to the notion that "Programming in Objective-C is easy," but I don't see that in his post. ExpiredLink on March 7: Most ended up being ported to Java or C#.

The syntax IS absolutely horrible if you ask me, as it results in crazily verbose ways of expressing stuff. Everything is "too meta" and there are very few first-class parts of the language.
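For anyone who hasn't seen it, this is roughly the kind of verbosity being complained about; plain pre-literal-syntax Objective-C, with made-up keys and values:

    #import <Foundation/Foundation.h>

    int main(void) {
        @autoreleasepool {
            // Building a two-entry dictionary and reading a value back,
            // the long-hand way (before the @{...} literal syntax existed).
            NSDictionary *options = [NSDictionary dictionaryWithObjectsAndKeys:
                                         [NSNumber numberWithBool:YES], @"verbose",
                                         [NSNumber numberWithInt:3], @"retries",
                                         nil];
            NSNumber *retries = [options objectForKey:@"retries"];
            NSLog(@"retries = %@", retries);
        }
        return 0;
    }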

Add to that the reference counting implementation when GC is not enabled (which you can't enable on iOS) and it's just painful. Also the lack of any decent threading abstraction - ick. I think there is a lot of hype around it.

It's not where we should be by now; Android does better with its bastardised Java, if you ask me. Grand Central Dispatch is a lot easier to work with than most explicit threading mechanisms, and with the new block support it is a lot less verbose than the typical Java thread-based approach. Another kludge in Objective-C.

Most of the verbose mess you see in Java threads is because the person writing it doesn't know much. Your "fugly syntax extension" is your old friend the closure. The syntax is as good as it's going to get in an Algol derivative. I'll take it over plain Java any day of the week. If Kotlin takes off on Android then we'll have a real contest.
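For concreteness, here is roughly the GCD-plus-blocks idiom being referred to; the two helper functions are hypothetical stand-ins for "slow work" and "update the UI":

    #import <Foundation/Foundation.h>

    // Hypothetical stand-ins for "slow work" and "update the UI".
    static NSString *fetchDataSomehow(void) { return @"result"; }
    static void updateUserInterface(NSString *s) { NSLog(@"UI now shows: %@", s); }

    int main(void) {
        // Push slow work onto a background queue, then hop back to the main
        // queue for the follow-up, all with blocks instead of Thread objects.
        dispatch_async(dispatch_get_global_queue(DISPATCH_QUEUE_PRIORITY_DEFAULT, 0), ^{
            NSString *data = fetchDataSomehow();
            dispatch_async(dispatch_get_main_queue(), ^{
                updateUserInterface(data);
            });
        });
        dispatch_main();   // keep the main queue running so the blocks execute
        return 0;
    }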

Hm, I beg to differ. Have you ever used C#? I spend half my day in iOS development and the other half on a Java web stack. But I'm reaching the point of thinking: I'm an application developer, why am I spending half my time debugging memory leaks and concurrency issues? Java isn't much better either: why all this boilerplate, and still concurrency nightmares? I've done a handful of side projects with Django and that's better, but I still think if I showed my teenage self what I'm programming in, he'd wonder if there was ever any real progress.

I guess what I'm saying is, after all these years I want to work at a higher level; as a result I've started to play with Clojure and functional languages. Dispatch queues have made most of my concurrency woes go away; same thing with ARC for memory leaks. Those aren't Objective-C objects, but I get your point. ARC does not relieve you of thinking about memory, it just makes it easier. AznHisoka on March 6:

ARC to me is scary. I got over the huge learning curve, and the nuances of autoreleasing and retaining, and it seems like now I gotta unlearn all of that??? To me, I'm gonna hold off using ARC as long as possible. Well, you don't have to "unlearn" it. It's actually a good thing you went through the "pain" of learning it pre-ARC, because if you understand how reference counting actually works, you will be able to make better and more informed decisions about the management of your objects in iOS 5.
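As a reminder of what that pre-ARC "pain" looked like, here is the classic manual retain/release setter, i.e. the boilerplate ARC now writes for you (the class and property are made-up examples):

    #import <Foundation/Foundation.h>

    // Pre-ARC (manual retain/release) class with one retained string ivar.
    @interface Person : NSObject { NSString *_name; }
    - (void)setName:(NSString *)name;
    @end

    @implementation Person
    - (void)setName:(NSString *)name {
        if (name != _name) {
            [_name release];          // give up ownership of the old value
            _name = [name retain];    // take ownership of the new one
        }
    }
    - (void)dealloc {
        [_name release];
        [super dealloc];
    }
    @end
    // Under ARC the compiler inserts the retain/release calls itself, so the
    // setter collapses to: _name = name; (or just a synthesized @property).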

Well, the same goes for every non-memory-managed object in any language. Like SWT objects in Java. Or file descriptors. Or DB connections. Indeed, as Aaron above mentioned, ARC doesn't absolve the programmer of the responsibility of proper memory management, it's just less overhead to have to worry about. To me, this is almost like asking, "Why, in this day and age, is the Halting Problem such a problem?"
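A small sketch of the point being made, under ARC: the Objective-C object is freed for you, but the file descriptor it wraps still has to be closed by hand (the class and path are made up):

    #import <Foundation/Foundation.h>
    #include <fcntl.h>
    #include <unistd.h>

    // ARC manages the lifetime of the LogFile object, but it knows nothing
    // about the POSIX file descriptor inside it.
    @interface LogFile : NSObject { int _fd; }
    @end

    @implementation LogFile
    - (instancetype)init {
        if ((self = [super init])) {
            _fd = open("/tmp/example.log", O_WRONLY | O_CREAT | O_APPEND, 0644);
        }
        return self;
    }
    - (void)dealloc {
        if (_fd >= 0) { close(_fd); }   // ARC will never do this for us
        // no [super dealloc] under ARC; the compiler handles it
    }
    @end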

What made Clojure stick out for me was its easy access to the vast Java libs, and its philosophy on concurrency. Both (and I could be very wrong) seemed novel in the functional world. Because Objective-C targets everyday desktop apps and mobile apps. In that space, manual memory management still wins the day. Not many major end-user apps outside the enterprise are made with either. Not any famous, widely used ones, anyway. Azureus, maybe, and a few dozen more.

Avoiding GC doesn't equate to "manual memory management." I'm pretty sure most popular scripting languages use reference counting instead of garbage collection. Also, as of a few years ago, the only performance-related reason why the JVM wasn't a popular platform for desktop GUI apps was startup time. In the mobile space, it might be true that Java isn't fast enough on current hardware.

My experience with Android hasn't been very inspiring, for sure. Keep in mind that desktop GUI frameworks take a HUGE amount of time and labor to create, and almost all of the excitement has been in web apps for the last decade.

The status quo in GUI frameworks is heavily colored by history. All of the major GUI application frameworks are ancient and reflect the linguistic realities of the era they were designed in much more than they reflect current technology. I don't think so. Besides startup time, Swing was always slow: an over-engineered mess.

For some Java people it was always "fast enough in the latest version" (like for some Linux people it was always "the year Linux wins the desktop"), but even the best Swing UI had perceptible lag compared to a bog-standard native one. Swing also had the uncanny valley effect, trying to mimic native UIs. If we judge Java by Eclipse, can we judge C by iTunes? Except for that damned startup time. Swing's a mess, but if you're looking for the technical limitations of a language platform, it's the best performers that are relevant, not the worst performers.

Otherwise, iTunes is evidence that even C is just too slow. There's also Minecraft, one of the best-selling videogames ever. That's because your MacBook Pro is poorly designed. I had the same problem with mine. One reason I sold it - impossible to sit with it on my lap. Minecraft isn't as popular as it is because it's the result of brilliantly-written Java; it sells so well because it's a brilliant concept.

That's true, but I'd guess that's true of much desktop software. I would bet giant piles of legacy code are a bigger reason for not moving to C# than anything language-specific is. And Notch himself is well known for his inefficient, magic-number-and-circular-reference-ridden Java code. Although I can't attest to the coding skill of Jeb (who is now the lead dev). Uhhh, Android? Does Android count? The compact Dalvik Executable format is designed to be suitable for systems that are constrained in terms of memory and processor speed.

It's still garbage collected. Dalvik does fix the startup issue with better class files, yes. RandallBrown on March 6: It has a WPF frontend anyway. I'm sure it drops into native code pretty quickly.

Pretty interesting read. That's down to WPF, which is a total pile of shit to be honest, having spent the last 3 years with it. It's nothing to do with the language. I'm kind of fond of WPF. Lots of aspects of it (data binding, styles, templates, the layout system, etc.) seem very elegant to me. It is not without issues. How do you justify the 'total pile of shit' call? Requires much faster kit with graphics hardware acceleration to run: we had to bin a pile of Matrox Parhelia cards and replace them with hefty NVidia cards to make use of hardware acceleration, where GDI was fine on the Matrox.

Editor sucks. Hard to do trivial things. Virtually impossible to produce a scalable composite application. Grinds a 16-core Xeon to a halt inside VS. Learning curve from hell (this hurts on a 20-man team). It's not good progress - it's just a deeper abstraction. I can't argue with most of those. The designer sucks, and I blame that for lots of the VS slowness. I never open XAML files in the designer. Not sure about 'scaling up' relative to GDI - I guess if you're a gun GDI programmer you can probably make it do pretty much anything, but I felt more productive doing graphics stuff in WPF - it seemed to let you do quite a few cool things pretty easily.

The ILMerge thing is a pain, but not a major one unless you've gone out and built thousands of assemblies and are getting slammed by load times, in which case you kind of painted yourself into a corner there. When you say "virtually impossible to produce a scalable composite application", what do you mean exactly? I can't help but feel MS should have paid more attention to perf, and maybe re-platformed it on top of Direct2D.

Anyway, thanks for sharing those pain points.


