Octave is ok for that, but not as good as Matlab. Most of the Matlab plotting functions work, but the plots don't look as nice and need more tweaking to reach "publication quality". Octave includes a Qt-based GUI, but it's nowhere near as polished as the Matlab GUI.
Python is also just ok. You can use matplotlib[1] to create plots through an interface explicitly modeled on Matlab's, or use higher-level wrappers like Seaborn[2], which reduce the work required to produce nice-looking plots of certain types. "Zoom to different features" can be done via Pandas[3], or directly via NumPy, although the syntax can get a little cumbersome. The best Matlab-like GUI is probably "Scientific mode" in PyCharm[4], which requires a paid license. Jupyter notebooks[5] are a free option, but they don't include all the features you'd have in a Matlab-style GUI.
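To make the matplotlib/Pandas point concrete, here's a minimal sketch of the Matlab-style pyplot interface plus a Pandas "zoom" done with boolean indexing. The column names and the synthetic signal are made up for illustration:

    import matplotlib.pyplot as plt
    import numpy as np
    import pandas as pd

    # Synthetic stand-in for real data (column names are hypothetical).
    t = np.linspace(0, 10, 1000)
    df = pd.DataFrame({"time": t,
                       "voltage": np.sin(2 * np.pi * t)
                                  + 0.1 * np.random.randn(t.size)})

    # pyplot mimics Matlab's stateful figure/plot/xlabel/title calls.
    plt.figure()
    plt.plot(df["time"], df["voltage"])
    plt.xlabel("time (s)")
    plt.ylabel("voltage (V)")
    plt.title("Full trace")

    # "Zoom to a feature" by boolean indexing instead of a GUI zoom box.
    window = df[(df["time"] > 2.0) & (df["time"] < 3.0)]
    plt.figure()
    plt.plot(window["time"], window["voltage"])
    plt.title("Zoomed: 2-3 s")
    plt.show()

This is the cumbersome-syntax tradeoff mentioned above: the filtering works fine, but it's wordier than an interactive zoom in a Matlab-style GUI.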
R is, in my opinion, best-in-class for exploratory data analysis. The combination of ggplot2[6] and dplyr[7] makes plotting, drill-down, aggregation, etc. easier than any other environment I've used. Chapters 2 and 4 of the free "R for Data Science" book[8] provide great examples of this. The RStudio[9] GUI is comparable to Matlab in features and overall polish, and it's available with either an AGPLv3 or commercial license. Jupyter notebooks also work with R.
The major downside of R is that it's heavily optimized for interactive use, which makes deploying it to any sort of production environment tricky and error-prone. I'd mostly advise doing your ad hoc analysis in R, and rewriting things in another language if you want to deploy them. The other downside is that all the major deep learning libraries are built around Python, so if you want to work in that space you should really just use Python.
[1] https://matplotlib.org/
[2] https://seaborn.pydata.org/
[3] https://pandas.pydata.org/docs/user_guide/indexing.html
[4] https://www.jetbrains.com/help/pycharm/matplotlib-support.ht...
[5] https://jupyter-notebook.readthedocs.io/en/stable/notebook.h...
[6] https://ggplot2.tidyverse.org/
[7] https://dplyr.tidyverse.org/
[8] https://r4ds.had.co.nz/
[9] https://rstudio.com/products/rstudio/