Alright, before I start ranting, I'd like to make clear that I mean no offence to any job, occupation, or career here; I'm just voicing my personal opinions.
It's been 3 years since I graduated from a local uni during the pandemic, and I've worked for 2 companies since. I'd say both were pretty fulfilling: I got to learn a lot of new skills, deal with clients (especially problematic ones), liaise with customers, and find the courage to argue with my bosses whenever necessary. I've pretty much equipped myself with plenty of skills to upgrade myself.
However, some of my insurance agent acquaintances and friends who became these so-called "wealth planners" or "financial agents" have been approaching me to persuade me to join them, telling me things like "oh, with your skills you can definitely make it", "you have the potential to become very successful", "you can easily achieve MDRT within 1-2 years with your strong selling skills", blah blah blah, so on and so forth.
Honestly, I've met quite a number of insurance agents, be it from uni, the gym, dating apps, or through friends/acquaintances, and 90% of them graduated from a private uni, are uni dropouts, or didn't even complete poly/JC, and that is the main reason I stay away from this industry.
They can be ah lians, girls who dress up as if they're on a catwalk, guys with easily noticeable tattoos. No offence, guys, but to me there are plenty of other jobs that make a good living too, and those jobs do require uni degrees: lawyer, doctor, pharmacist, scientist, engineer, lecturer, teacher, mergers and acquisitions, corporate banker, etc.
I really don't wish to waste my degree on a job that doesn't need a degree AT ALL. Yes, if I do well, if I can sell, if I can maintain good relationships with clients, if I can expand my network, I can make a ton of money selling wealth plans and my degree no longer matters.
I get it, I'm totally aware of that. But from my standpoint, I just don't understand why people keep thinking that we must, or rather should, join insurance to make money. I seriously don't get it.
It's just like prostitution: yes, it's legal, you can make a living out of it, but I wouldn't enjoy the job, so I don't do it. In fact, I got so mad at one of them that I told him straight to his face, "eh, any Tom, Dick and Harry can join your Prudential as long as they put on a suit and tie and can sell, so stop trying to pull me in".
Is it wrong to have a career that requires a uni degree, one that I enjoy every single day? Frankly speaking, staying away from a career pursued mostly by less-educated people, and being unwilling to associate with such people, especially those who didn't go to uni, is really just a personal preference.
I don't force anyone to follow my choices, I don't discriminate against them when I'm doing my job, and I certainly don't think it's wrong to pursue a career that caters mostly to the less-educated.
In fact, in my previous job I had to deal with all types of customers, and I had a great time talking to them for hours and meeting them outside the office for contract discussions.
So guys, if any of your friends DO NOT want to join insurance, just stop pressuring them to join. Just take it that we don't fancy flexing our BMWs and Rolexes on the Gram, can, bruh???