While you can technically rank without doing technical SEO 🤣 😆 it's not advised. Skipping it can hurt the user experience and make it harder for Google to crawl and index your website.
Website Speed and Performance
Photos
One of the quickest and easiest ways to improve your website's performance is through image optimization, or compression. If you are using WordPress, there are plenty of plugins that can help you with this task, such as Smush Pro, Imagify, and EWWW Image Optimizer. Most of them offer a free trial or a free tier limited to a certain number of images and/or lower-quality compression. I personally recommend upgrading, because the paid capabilities are far superior.
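If you'd rather compress images yourself before uploading them (the plugins above handle this automatically), here's a minimal sketch using the Pillow library. The size and quality values are just reasonable starting points, not magic numbers:

```python
# Minimal image-compression sketch using Pillow (pip install Pillow).
# Caps the image width and re-saves it as an optimized, lower-quality JPEG.
from PIL import Image


def compress_image(src_path: str, dest_path: str,
                   max_width: int = 1600, quality: int = 75) -> None:
    """Shrink an image to at most max_width pixels wide and save an optimized JPEG."""
    img = Image.open(src_path)
    if img.width > max_width:
        # Preserve the aspect ratio while capping the width.
        new_height = round(img.height * max_width / img.width)
        img = img.resize((max_width, new_height))
    # Convert to RGB in case the source has an alpha channel (e.g. a PNG).
    img.convert("RGB").save(dest_path, "JPEG", quality=quality, optimize=True)
```

A JPEG quality of 70–80 is usually a good trade-off between file size and how the photo looks on screen.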
Videos
For videos, use YouTube, Vimeo, or another host to deliver them; otherwise they will kill your hosting. Even if you are on a dedicated server, I still recommend using a platform to deliver them.
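Embedding a hosted video is just a small snippet; for example, a YouTube embed looks like this (VIDEO_ID is a placeholder for your video's ID), and the video then streams from YouTube's servers instead of yours:

```html
<!-- YouTube embed: the video streams from YouTube, not your hosting -->
<iframe
  width="560" height="315"
  src="https://www.youtube.com/embed/VIDEO_ID"
  title="YouTube video player"
  loading="lazy"
  allowfullscreen>
</iframe>
```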
CDN
The use of a CDN can boost your performance further by serving your website's assets from the server closest to each visitor.
There are plenty of other things you can do to boost your website's performance, but these are the most common issues IMO.
Mobile-Friendliness
Having a responsive design is important to ensure that your visitors enjoy a website that functions on all types of devices. This not only helps with trust, it can also affect how long someone stays on your website. If they are using a mobile device and something doesn't look right or seems broken, it could leave a bad taste in their mouth.
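The foundation of a responsive design is the viewport meta tag plus CSS media queries. A minimal sketch (the .sidebar class is a hypothetical layout element):

```html
<meta name="viewport" content="width=device-width, initial-scale=1">
<style>
  /* .sidebar is a hypothetical example element */
  .sidebar { float: right; width: 300px; }
  @media (max-width: 768px) {
    /* On small screens, stack the sidebar under the content instead */
    .sidebar { float: none; width: 100%; }
  }
</style>
```

Most modern WordPress themes handle this for you, but it's worth checking your site on a real phone rather than assuming.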
Mobile First Indexing
Google predominantly uses the mobile version of your site for indexing and ranking, since the majority of people browse on mobile devices. So if you don't have a mobile-friendly website, your chances of ranking are slim to none.
SSL
A secure connection (HTTPS) is a ranking signal. Implementing an SSL certificate is crucial for ensuring your site is secure and trustworthy.
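Once the certificate is installed, you'll also want to redirect all HTTP traffic to HTTPS. On an Apache server (common for WordPress hosting), a typical .htaccess sketch looks like this; check your host's documentation before using it:

```apache
# Redirect all HTTP requests to HTTPS (301 = permanent redirect)
RewriteEngine On
RewriteCond %{HTTPS} off
RewriteRule ^(.*)$ https://%{HTTP_HOST}%{REQUEST_URI} [L,R=301]
```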
XML Sitemaps
An XML sitemap helps search engines understand your site's structure and find all of its pages. It's essential to keep the sitemap updated and submit it to Google Search Console.
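A bare-bones sitemap looks like this (example.com and the dates are placeholders); most SEO plugins generate one for you automatically:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>https://www.example.com/</loc>
    <lastmod>2024-01-15</lastmod>
  </url>
  <url>
    <loc>https://www.example.com/about/</loc>
    <lastmod>2024-01-10</lastmod>
  </url>
</urlset>
```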
Robots.txt File
Your robots.txt file tells search engine crawlers which pages or sections of your site should not be crawled. It’s essential to ensure this file is correctly configured to avoid accidentally blocking important content.
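A simple robots.txt for a WordPress site might look like this (the paths shown are the WordPress defaults; adjust for your own setup):

```txt
# Allow all crawlers, but keep them out of the admin area
User-agent: *
Disallow: /wp-admin/
Allow: /wp-admin/admin-ajax.php

# Point crawlers at your sitemap
Sitemap: https://www.example.com/sitemap.xml
```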
Structured Data Markup (Schema)
Using structured data helps search engines better understand the content of your pages and can lead to enhanced listings in the SERPs, like rich snippets.
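For example, marking up a blog post with JSON-LD (Google's preferred structured data format) looks like this; all of the values below are placeholders:

```html
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "Article",
  "headline": "Technical SEO Basics",
  "author": { "@type": "Person", "name": "Jane Doe" },
  "datePublished": "2024-01-15"
}
</script>
```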
URL Structure
URLs should be short, descriptive, and include relevant keywords. Avoid using complex parameters or unnecessary numbers.
To prevent duplicate content issues, use canonical tags to specify the “preferred” version of a page.
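The canonical tag goes in the page's head and points to the preferred URL (example.com is a placeholder):

```html
<!-- Tells search engines this URL is the "preferred" version of the page -->
<link rel="canonical" href="https://www.example.com/technical-seo-guide/">
```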
Search Console
Set up Google Search Console and check it regularly for indexing issues, crawl errors, and improvement suggestions.
There is so much more but these are some of the most important!