Does work give life more meaning?

In your humble opinions, does work provide more meaning in life compared to a life where work isn't involved?

I'm interested in all sides of the question - whether choosing to work (or not) is necessary to maintain a good quality of life, whether enjoying the work you do is enough to add meaning, or whether what matters is the impact it has on the wider world.
It matters. If you are simply working to survive, then no. You must overcome yourself, improve, and get out of your comfort zone.

I believe the meaning of life is to overcome yourself and better yourself, to become unrecognisable in your growth.

Life is all about adaptation, and I dislike the idea of having only one job. I started my business while I was in university, and I am now a multi-millionaire, yet I still take on mundane jobs for certain periods of time to overcome myself, broaden my knowledge, sharpen my skills, and better myself as a person overall.

You should always strive to better yourself.