#022 | How to achieve 10x more with this one skill

Weekly newsletter on Natural Language Processing (#NLP365), Entrepreneurship, and Life Design content!

Hey friends,

At some point in our lives, we have probably come across the debate between hard work and smart work, thinking that we would probably be better off working smartly and achieving our goals in less time.

Some goals can be achieved simply by working smartly and spending less time. Others require both hard work and smart work. The level of hard work and smart work we need depends on the goals we are striving to achieve. The bigger our goals are, the harder we need to work, regardless of how smartly we work.

Now, there’s a limit to hard work, given both the 24-hour time constraint and how much of those 24 hours we actually utilise. This is where all the productivity tricks and tools become popular: they help you utilise more of your 24 hours by procrastinating less and working more efficiently.

But what about smart work? The best way I can explain smart work is to think of it as a navigator strategising different pathways from point A to point B, while hard work is the actual execution of a chosen pathway from point A to point B.

We all know that smart work is important. In flight navigation, there’s the 1 in 60 rule of thumb: if a plane is 1 degree off course, it will miss its target location by roughly 1 mile for every 60 miles flown!
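To make the rule concrete, here is a minimal Python sketch (my own illustration, not from any navigation reference) that approximates how far off target you end up for a given heading error and distance flown:

```python
import math

def off_track_miles(distance_miles: float, error_degrees: float) -> float:
    """Approximate cross-track error: the 1 in 60 rule says roughly
    1 mile off target per 60 miles flown, per degree of heading error."""
    return distance_miles * math.tan(math.radians(error_degrees))

# A 1-degree error over a 600-mile flight puts you ~10 miles off target.
print(round(off_track_miles(600, 1), 1))  # ~10.5
print(round(off_track_miles(600, 5), 1))  # ~52.5
```

The small error compounds with distance, which is exactly the point: the longer the journey, the more a tiny course correction matters.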

In practical terms, this means that the bigger your goals are and the longer they take to reach, the more important it is for you to work smartly (and not just hard). In this instance, I strongly believe in the following:

A 1% improvement in navigation will trump a 100% improvement in execution

What does this mean for you?

Most of us are busy executing and working all the time, barely stopping to reflect on the work we are actually doing. It’s easy to fall into a routine, work on the wrong things, and wonder why we never achieve what we set out to achieve.

It’s therefore important to schedule weekly or monthly check-ups on existing projects and ask yourself:

If I continue to do what I am currently doing, will that bring me closer to where I want to be?

If the answer is no for too many days, it’s time to put on the navigator hat and re-strategise pathways and new action plans that will actually bring you closer to your goals!

Don’t work blindly; know why you are doing what you are doing. Work smartly by improving your navigation skills.

Next week I will share some of the systems for doing this that I have found particularly helpful, so stay tuned 😊

This week I am reading:

  1. The Laws of Human Nature (26th Apr - In Progress)

  2. Effortless (28th May - In Progress)

  3. Heart Breath Mind (31st May - In Progress)

Total: 36 / 26 books | 3 / 26 level 4 notes | 2 / 12 actions

❓Question of the Week

If you continue to do what you are currently doing, will that bring you closer to your goals?

Share your thoughts by replying to this email. I would love to hear from you! 👻 👻 👻

🐦 Tweet of the Week

💡 Quote of the Week

Left Brain Core Drives are by nature goal-oriented, while Right Brain Core Drives are experience-oriented. Extrinsic Motivation focuses on results, while Intrinsic Motivation focuses on the process — Actionable Gamification

🔥 Recommendation(s) of the Week

No recommendation this week.

🔦 AI Research - List of Papers on Relation Extraction that I have read :)

  1. Few-shot relation extraction via bayesian meta-learning on relation graphs

  2. Relation extraction with explanation

  3. Adapting meta knowledge graph information for multi-hop reasoning over few-shot relations

  4. Hybrid attention-based prototypical networks for noisy few-shot relation classification

  5. Long-tail relation extraction via knowledge graph embeddings and graph convolution networks

  6. A hierarchical framework for relation extraction with reinforcement learning

  7. Relation extraction using supervision from topic knowledge of relation labels

  8. Neural relation extraction via inner-sentence noise reduction and transfer learning

  9. Cooperative denoising for distantly supervised relation extraction

  10. Robust distant supervision relation extraction via deep reinforcement learning

  11. Distant supervision for relation extraction without labeled data

  12. Constructing biological knowledge bases by extracting information from text sources

🎥 This Week on YouTube

Due to unforeseen challenges, there will be no new YouTube video this week! I will try to get back up to speed as soon as possible. Meanwhile, here are some of my previous videos 😊

That’s it for this week! I hope you found something useful in this newsletter. More to come next Sunday! Have a good week ahead! 🎮

More of me on YouTube, Twitter, LinkedIn, and Instagram.