Is it better to plant in wet or dry soil?
Moist soil is better. Wetting the soil before planting is a crucial step to ensure your plants get off to a good start. Skipping it can leave your plants poorly rooted, or cause them to dry out and die shortly after they go into the ground. Dry soil will not do the job.