Flowershow cloud waitlist https://docs.google.com/spreadsheets/d/1CFxv7i2oClJEyAltXR9UsDpylFeeKZAMZ_FnQYtFI5Y/edit#gid=0

Examples:


https://publish.obsidian.md/chromatically/publish+homepage

https://yomaru.dev/home

Onboarding Datopians

User experience Docs Value

3 June Onboarding Oleg

  • High-level technical design
    • Where it started
    • Where is it today
    • Where is it going
  • Where is it going in your head?
    • Possible directions for our idea
  • High-level technical design

We don't currently have any paying customers. There is a trajectory. It is intended to be a commercial venture. If it becomes successful, we will all share in that success.

3 June Osahon

Access control

UFC dataset

3 June Muhammed

full stack dev

Participated in the hackathon

Are you publishing datasets somewhere?

  • No

Are you using PKM?

  • No

31 May Sneha

Are you publishing datasets somewhere?

  • No

Are you using PKM?

  • No

Currently not using anything similar, and not publishing or sharing content

Create a dataset from scratch, add some visuals, and let me know your feedback

31 May Ronaldo

Are you publishing datasets somewhere?

  • No

Are you using PKM?

  • No

Currently using laptop notes

Currently working with locations …

Publish your city.csv data

31 May Luis

He just created a dataset. He participated in the hackathon

https://datahub.io/@LuisVCSilva/iris_dh https://datahub.io/@LuisVCSilva/datahub_abc

He has other ones he did.

Are you publishing datasets somewhere?

  • No
  • Just some CSV files that he is currently storing locally on his computer
  • He is a PhD student and this is what he needs atm
  • He is running experiments, creating data visualizations
    • He created the visualizations and simulation in .py himself
    • He also created a plot.py

requirements.txt, CSV file, JSON file (params.json)

so that every researcher can see the data and use it…

https://github.com/LuisVCSilva/doutorado/tree/main/Se%C3%A7%C3%A3o%20de%20choque/resultados/hxe

Contributor !!!!

Time series and comparison / correlation between two time series of data

31 May Luccas M

Are you publishing datasets somewhere?

  • No

Are you using PKM?

  • No

Using a calendar for anything note-related; not taking notes or using anything similar.

Markdown is a mess. Hashtags in front of headings don't work

Back in the Flowershow days, it was still a mess.

31 May Michael

Are you publishing datasets?

  • No

Are you using PKM?

  • He is using Obsidian and hackmd

Info dumps and note-taking

Potential user? He doesn't have data that he wouldn't want to share with the world. Anytime he gets data he's interested in, he just looks into it; he isn't gathering it for any purpose. Not necessarily a user, or even a would-be user. But we tried to pitch this (a very early version of it) to Vitals, and he has used it

There are a lot of use cases for this product. Michael and Osahon agreed that a more friendly editor would be great for the use cases.

On one hand, there are data scientists and analysts familiar with markdown, etc.

Huge market of people that are working with data but are not technical data people. They have a dataset and have an idea of a vis.

What you see is what you get kind of editor. There might be a much bigger market there

Demenech said they were thinking more of tech people.

2-3 meetings to try and sell this. They had a hard time picturing what they would use it for. Yearly reports, this, and that.

It is easy, but if you didn't know markdown…

A free product, even if it is super basic - it can be extensible. Just basic functionality.

You love data but you're not so technical. EDITOR!!!

Couple of options for the visualizations

31 May William

Are you publishing datasets?

  • No

Are you using a PKM?

  • Obsidian

Publish your flowershow project in datahub cloud

31 May Yedige

Are you publishing datasets?

Are you using a PKM?

  • Notion, Obsidian, hackmd…

Kaggle you just publish the data and you work on the data

There is no EDA. Users themselves participate in doing the EDA and…

Competition platform for datasets

31 May Stephen

Are you publishing datasets?

  • No

Are you using a PKM?

  • No

He created something similar and he is using it for personal use.

https://www.scatterednote.com/ He was thinking of integrating it with Obsidian and using the Obsidian Editor

Template + visuals and give feedback on user exp and docs

He published something.

31 May Shubham

Are you publishing datasets?

  • Nowhere

Are you using a PKM?

  • No

Working locally; sharing via HackMD or GitHub

31 May Shreyas

Are you publishing datasets?

  • No

Are you using a PKM?

  • He uses Notion to make personal notes
  • Excalidraw as well

Do you have some projects eg. for uni or any personal projects that you are publishing somewhere and sharing with people?

  • Not really
  • Hosting the website - creating a website within Vercel //
  • He used to use Vercel in university as well - his personal projects
  • It lets you host for free unless you have a lot of data

People are generally confused about Datahub Cloud and which one this is

Action: follow the docs https://datahub.io/docs (in particular this tutorial) and give me your feedback

30 May Lucas B

Are you publishing datasets?

  • No

Are you using a PKM?

  • Yes, I use Notion on Whattofix and Obsidian for locally editing stuff on datopian.com
  • He is using Hack.md for personal use

He went straight to the enterprise

In cloud.datahub.io, he overlooked the instructions and didn't see them.

Atm this is great. But maybe clicking on the button could lead to these instructions. Bigger fonts would work as well. Making it bold. Text adjustments. Font adjustments. Make the button more prominent.

Publish one of your HackMD projects with DH Cloud and give your feedback.

30 May Marcelino

didn't join

30 May Vinicius

  • What do you do in life?
    • He works with data analysis and data science at an education company. He volunteers in a community that helps homeless people.
  • What need/problem are you trying to solve with Datahub Cloud?
    • He knows Datahub as a place to store data for his personal projects and his community's projects.
  • How did you hear of Datahub? ChatGPT
    • Free storage for his data projects
  • Where are your data projects currently?
    • Atm he is using Google Sheets and the free plan of Google Cloud Platform
    • BigQuery in Google Cloud Platform
      • using it on a lot of projects
  • Have you worked with GitHub before?
    • Yes, with Streamlit (a Python library)
    • Link to the app cloud.datahub.io
    • Link to the docs (within the app)

https://streamlit.io/

Markdown?

  • Yes

Datastudio integration?

ABILITY TO MANAGE ACCESS

Send him an email

30 May Leonardo

Are you publishing datasets?

  • No

Are you using a PKM?

  • Obsidian to take notes
  • He is just copying the text and pasting it somewhere. Normally it is just text

Do you imagine using Datahub Cloud for something?

  • Probably for publishing study notes.
  • He is not a fan of publishing.

Quite slow

Publish one of your Obsidian notes with DH Cloud and give your feedback

29 May Ismail

Are you publishing datasets?

  • No
  • He hasn't done it for a while

Are you using a PKM?

  • Notion - for logging stuff
  • hackmd

He is using it for workouts

Replicate this mini diary database of workouts

29 May Meiran

Are you publishing datasets?

Can you try doing…

I can use it but it is free…

Then it's the perfect tool for him

It takes a lot of time for each site to build.

Meiran is a data curator for Curated Open Datasets https://github.com/open-data-kazakhstan/

Is it possible to hide the download data button?

In Kazakhstan, people don't keep the credibility; that's why he is using Vercel - he can hide everything.

29 May Sagar

Are you publishing datasets?

  • He used to publish from the Open Data Portal in Nepal
  • After that, he used to publish on datahub as well

https://datahub.io/

Are you using a tool for knowledge management?

  • No

Have you signed up to Datahub Cloud?

  • No

Root Dir: not entirely clear what he needs to put there

It was slow. And then he was in the dashboard and had to go there. After scrolling down, he saw the Visit button and clicked it.


Vinicius 24 May 2024

  • What do you do in life?
  • What need/problem are you trying to solve with Datahub Cloud?
  • Do you currently use another tool to do that?
  • Have you worked with Github before
    • Link to the app cloud.datahub.io
    • Link to the docs (within the app)

Marco 24 May 2024

  • Sweden
  • What do you do in life?
    • Data engineer role. Doing stuff on keyboard.
  • What need/problem are you trying to solve with Datahub Cloud?
    • Actually not sure as well. He registered for the Datahub newsletter a long time ago and was comparing some tools as a consultant
  • Are you currently publishing data?
  • Do you currently use another tool to do that?
    • Yes, notion (he used obsidian as well)
    • Publishing PKM
  • Have you worked with Github before
    • Link to the app cloud.datahub.io
    • Link to the docs (within the app)

https://datahub.io/@mighty-mass/potential-octo-spork

Inside the company they are not using Github

Will try it out for himself //

Garried

David Gasquez

Talk to the https://github.com/catalyst-cooperative/pudl/ guys; tell them that Datahub is back after some absence and we'd love to hang out

Let's try to get these guys on board

8 May 2024

Publish your Obsidian Vault to Github https://portaljs.org/howtos/publish-obsidian-vault-to-github and publish it with Datahub Cloud

3 May 2024 El Mehdi Semlali

How did you find us?

What are you looking to get in this meeting?

How do you want it to go?

2 May 2024 Youri

ChatGPT

Freelancer business analyst

He's interested in making data visualizations for Linkedin

In terms of technical-savvy level, you're not a coder and you know Tableau.

Business background, knows R and markdown //

Tableau is very powerful //

18 Apr 2024 Oleg / Daniela

Do hackathons work?? Research project. Underprivileged people that we wouldn't expect at a hackathon. See the effects. Is it helping them network, become more confident with technology, is it making them feel like Switzerland is home?

Team building that happens at hackathons

dribdat feedback is needed.

Integrate PortalJS into dribdat even better.

Could Datahub Cloud be part of the hackathon going forward??

work on tools to help facilitate that.. initial funding from the government…

https://en.wikipedia.org/wiki/Random_Hacks_of_Kindness

https://www.openspending.org/ flash hacks … how many companies would you manage to add to the DB at the end of these 4 hours?

Open Street Map you get additional points…

Feedback loop where we want to hear from people. Gamifying this process. We haven't seen it.

Every hack day - keep track of hackathons worldwide …

Hacker ethos … created a lot of mythology man. Somebody could be super productive. 4-hour week.

Open data is great. At the end of the day, it is one principle you can apply. In particular, if you want to get all these government organisations to update their web design… open data is great for that // open data is digital diplomacy, having all these datasets published in lines and rows… you can't do hackathons every day.

In the hackathon setting, you see how people work. You can sit next to them and get to experience how they work. Sit and quietly observe. HRs love it!

You're seeing how people work. Conflicts and clashes .. some people need a lot of adaptation. Some people click right away. You can go on social media and you're working in a coworking space. Another facet of public work.

https://rhok.cc/content/what-we-believe/

Hackathon Oleg Project page

https://bd.hack4socialgood.ch/project/97

Datahub => Datahub Cloud

🌀 Rapidly build rich data portals using a modern frontend framework => Publish datasets and data stories using markdown with a few clicks.

From ## How to create data-rich stories with DataHub onwards it is all fine!


https://link.excalidraw.com/l/9u8crB2ZmUo/GQJyQhESVE

There is a pitch on what we can do

But there is also a pitch on who we are

What Apple stands for …

Oleg 1 May 2024

Projects from the hackathon

https://bd.hack4socialgood.ch/event/6

One of the projects was published with DH Cloud –> https://bd.hack4socialgood.ch/project/89 https://github.com/Spaezli/teilhaber/blob/main/BACKGROUND.md#teilhaberin-background

Concept doc: https://s3.dribdat.cc/h4sg/2024/16/KGCPAM5YOP343QE92E6V8L0Z/Durch_Teilhabe_zu_mehr_Teilhabe_en.pdf

template Datahub Cloud for hackathons

I will publish the data packages

Baserow are interested in supporting hackathons. Talk to them and consider adding an integration?? Maybe we can co-develop a Baserow plugin for hackathon open data…


Working on a draft the last couple of days… dribdat: what's next https://drive.proton.me/urls/PQBN2W1QMG#5ojyvTNieLcH


School of data pipeline 1-7 guiding people through the process…

  • venue to evaluate tech stacks (Angular, Apache, etc.)
  • hackathons are starting to be taken more seriously
  • a real venue to…
  • Marathon = hackathon (nice way to explore the city as well :D)
    • tech tourism; parallel to art earth tech // teaching and events
    • why not explore the idea of bundling? Why not become a partner?
    • working with our tech team part-time helping with DH Cloud // Python, Next.js, helping the dev team and seeing if we can position this
      • planning to do exactly what we are doing with this flowershow thing
      • would benefit both the technical side and the business side
      • spark
      • he has been working really hard to ramp down all his engagements and finish all his other projects // as of today, he is very free // his commitment is the grant / research project (he is being funded)
      • would be great to commit 60-80% of time. Doing a lot of customer projects as a freelancer // would be glad to be part of a tech team
      • CKAN team struggling - not being able to do…
      • He presented dribdat to LinkDigital // dribdat is a CKAN plugin // the problem is timezones // Steve is crazy, his working hours are ridiculous
    • I am bringing really high-quality stuff out of me. My projects, my events, my outputs. I lack this context when working with small local companies etc.
    • SHARED experience.
      • Datahub cloud both technical and psychological
      • Landmark course

https://github.com/frictionlessdata/DataPackage.jl/issues/1

tourismdata.ch

https://frictionless.dribdat.cc/

How do people earn money from open data?

One week before hack4socialgood there was another hackathon in Zurich focused on accessibility, web and physical… you'd be happy to get some free food.

https://data.world/product/

https://github.com/loleg/earnopendata

Probezeit (probation period)

Evaluation period - let's start with a small project - 3 months

ups and downs salary statistics ..

On GitHub you can see when Oleg is most productive… up to 90% in total // reserve a few % for the research work // we could be the recipients of the grant… 3-year grant //

David 2 Apr 2024

Agenda

  • Check in
  • Iterate intention of the meeting
    • Show you what changes we made
    • Get your feedback
  • Next steps
  • AOBs

Bugs and things to fix:

  • docs link leads to the old docs
  • storybook portaljs link is broken
  • starting from scratch docs need more explanation
  • datasets 404 link fix

PortalJS storybook:

  • load the code by default
  • He didn't know how to use them until I explained (didn't see the "show code")
  • Group them:
    1. It is confusing. I want to see a linechart but I see 2-3 in different places. I want to see the graph type and not the framework.
  • Eg. Barchart and on click:
    • Plotly
    • Vegalite

Template

  1. Awesome metadata summary at the beginning. But you should have everything filled in. It is a template. Currently, there is metadata missing
  2. Docs link is not visible. I don't read it all at first. Add it at the top and make it big (eg. above the Data previews)

Screenshot 2024-04-04 at 16.28.10.png

You can have tags on top of the dataset – Hugging Face has tags for each dataset

What would it take for you to start using DH Cloud?

Trying to move away from Obsidian Publish. He wants more customization. He wants to customize a lot how the website looks. Requires a lot of network bandwidth. Wants better integrations with other components

I want to be able to do:

  • Quicklinks
  • Backlinks (Screenshot 2024-04-04 at 16.35.52.png)

I would also want to be able to customize:

  • Low-level HTML
  • No JavaScript
  • Being able to paste charts or other React PortalJS components
  • One level up from Obsidian Publish

Other key features needed:

  • SEARCH inside the site, Filter // search bar on the dashboard

Optimize for flexibility. If I am not happy with the search (eg. he is not happy with the Obsidian Publish search atm), I want to be able to change it. I want to make it as simple and as fast as possible. LET ME MANAGE.

Simple experience and I want to be able to change anything I don't like.

Lightweight

Oleg 2 Apr 2024

Agenda

  • Quick introductions
  • Reiterate the intention of the meeting
    • Onboard you onto DH Cloud v3 (he was using v2 of the product)
    • Get your expert feedback and suggestions (your input is gold)
  • Share your screen and go to https://cloud.datahub.io/
  • Feedback and suggestions
  • Questions
  • AOBs

Version 3

Notes

Share the beta program with others if you want to

24th of May in Fribourg https://opendata.ch/events/opendata-ch-2024-forum/

About the v3:

  • incredibly easy data publishing in the cloud
  • frictionless data packages and any markdown and data, we are trying to make it super easy to publish datasets and data-driven stories

He is working on a new project: Creating jobs that involve data + upskilling component // frictionless data vision of containerizing

He wasn't sure if this is working with private repos.

He wanted to escape github as soon as possible

I didn't have to refresh the page or anything - datahub cloud was there

He saw the name, the branch, the custom domain… and only then he saw the sync button there. Visit button was barely visible. He got back to all sites and then came back in wondering what he should do.

It is very complete. The description here. Maybe it is a bit too much.

I would want to take this over. So I would want to have it in a separate documentation page. It could be a separate markdown file. The template is too much. He would want to reuse it and take it over.

Let's test it. I want to change my readme.

Catalog component

  • It was not obvious that these are the only four datasets
  • 404 when clicking on the datasets in the data catalog

I don't know how to get back to Datahub, there is no link. It is probably a static page. He wouldn't want to have Google Analytics but he is not too bothered. Mention Google Analytics in the T&C // GDPR friendly

Fathom Analytics is preferable

One thing that he is not seeing from the v2 version is the import in the different programming languages; I am expecting to see the:

  • Size
  • Last modified

Data CLI won't work. He is used to generating data packages very quickly.

What I am expecting from Datahub is to somewhere have the ability to create a data package. Like create.frictionlessdata.io, which is terrible but it works.

I don't see a data package in here.

Delete the frontmatter and put the yaml or the json datapackage in the repository.

Completely offline. Nothing goes to any servers.

It didn't work: https://datahub.io/@loleg/hello-datahub (error in datapackage layout)

https://github.com/loleg/hello-datahub

Demo it for 10 mins and provide it for free

hack4socialgood.ch

Do a little workshop

Conference in May

https://www.zhaw.ch/en/socialwork/hack4socialgood/

Hackathon Hack for social good. At the end of this month.

There is an issue in the yaml.

When we delete files, do we delete them from R2? Does the syncing also delete?



Atm it works with index.md or README.md + datapackage.json file in your repo (frictionless data package)

Data package https://specs.frictionlessdata.io/data-package/#language
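
For reference, a minimal sketch of what such a datapackage.json could look like next to the README.md - the names and values below are purely illustrative, and the spec linked above is the authoritative reference:

```json
{
  "name": "example-dataset",
  "title": "Example dataset",
  "description": "Illustrative frictionless data package for a Datahub Cloud site",
  "licenses": [{ "name": "CC0-1.0" }],
  "resources": [
    {
      "name": "data",
      "path": "data/data.csv",
      "format": "csv"
    }
  ]
}
```

The README.md (or index.md) carries the narrative and any visualizations; the datapackage.json describes the data files that sit in the same repo.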

Questions

  1. How can I check if there are other people who published datasets with DH Cloud? I remember seeing how Rufus is querying this data from Vercel analytics but I can't seem to find where to get this
    1. https://vercel.com/datopian1/~/stores/postgres/store_W6kjDoNzg6sWFXpO/data
  2. What PortalJS components does DH Cloud currently support? https://storybook.portaljs.org/?path=/docs/components-introduction–docs
    1. Yes besides the old table components + OpenLayers + Iframe
    2. How can I eg. integrate a PDFviewer on my site atm? https://storybook.portaljs.org/?path=/docs/components-pdfviewer–docs&globals=backgrounds.grid:!true
      1. Copy it and paste it in the markdown
      2. And it supports relative paths, so the PDF can also live in GitHub (see the sketch below)
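
To make "copy it and paste it in the markdown" concrete, here is a rough sketch only - the component name and props are assumed from the storybook entry linked above, so verify them there before using this:

```mdx
{/* Hypothetical usage - check the PortalJS storybook for the exact component name and props */}

# My report

Some context written in plain markdown.

<PdfViewer
  data={{
    url: "files/report.pdf"
  }}
/>
```

Since relative paths are supported, files/report.pdf here would simply be a PDF committed to the same GitHub repo as the page.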

Could I show you how it works?

Agenda

  • Intro
    • Hello and welcome to the Onboarding of Datahub Cloud! I am excited to onboard you today. Here's what to expect
      • You will have published your first dataset with Datahub Cloud
      • You will know what the current prerequisites are to publish with Datahub Cloud
      • You will have most of your questions answered unless they are strictly technical in which case I will point you to the place where you get to ask them directly
  • No long introductions needed - I am Daniela and I am here to onboard you onto Datahub Cloud and answer any questions you may have during this process. It is time-boxed at 15 mins so we should try to be efficient. This is how I suggest the meeting goes:
  • I will share the onboarding guide and talk you through as you present your screen and follow the steps… please share your screen so that you get your hands dirty
  • Some cool examples of datasets https://datahub.io/core/finance-vix.
  • Feedback?

Examples of cool dataset sites: Finance vix repo https://datahub.io/core/finance-vix to Oil prices repo https://github.com/Daniellappv/oil-prices-this to https://datahub.io/@Daniellappv/oil-prices-this https://datahub.io/core/co2-ppm

Atm it works with index.md or README.md + datapackage.json file in your repo (frictionless data package)

Data package https://specs.frictionlessdata.io/data-package/#language

Features https://app.excalidraw.com/l/9u8crB2ZmUo/2Yi8iB94NSu

  • Catalog functionality
  • Large data files
  • Data API
  • Better viz
  • Built-in Editor
  • Better site layout
  • Working with more data sources

Joel Natividad

Storytelling component

When we deploy

When we were in open gov, one of the values was storytelling. They integrated that. For storytelling purposes they integrate that onto the data portal. This fills this gap. Lowers the bar. You don't need proprietary software; markdown is pretty intuitive. Almost self-documenting.

You may want to give a bit of magic to the template. Beyond the hello world example. Make it appealing.

https://data.dathere.com/showcase/nj-crime-stats

The bar for building a dashboard is much, much higher with Superset though. Asking some of their clients, eg. Boston, to do something like this would be a heavy lift.

  • Superset and Tableau are data vis
  • We want to talk about data storytelling

With Datahub Cloud, the bar is much lower.

https://data.dathere.com/showcase/nj-crime-stats They use the CKAN showcase extension atm and spin up a story focusing on several datasets rather than having this dashboard

To understand the template, it would be great to have a more complex showcase page with all the different components

Two main suggestions

  • Basic template vs Advanced template (advanced one showcasing all components would even solve the documentation need)
  • A way to go from the site back to the repo https://datahub.io/core/co2-ppm

They are building similar things on their end as well - reduce the friction to data publishing and data updates // metadata

They have to massage the data a bit in Excel. They need to put columns etc. and then they need to publish. They are also trying to automate a lot of these things, including metadata

Speaking of frictionless: one of the formats that they support is datapackage. If they populate the datapackage with stats etc., is there a Datahub Cloud datapackage?

QSV https://github.com/jqnatividad/qsv/releases Data ingestion https://github.com/jqnatividad/qsv

Instant aggregations without asking. We just do it. Do the aggregations automatically.

Visualizations work with aggregated data. To really lower the bar and the friction. This data file is not aggregated yet, not report-ready. This is a date field. Then there are multiple fields…

Pascal Heus

Questions around the metadata support and the description of the variables in the code. Is there a particular standard or metadata specification?

Github

Data size and ability to work with a data store behind an API is important

He is working on automating the whole data publishing process… Trying to offer a menu of options for data publishers; depending on their needs, they can cherry-pick an API. Entire process using natural language + API. Will be open source. He wants to demonstrate to people what they can do

He wants to remove the constraints. Currently, people need to be data experts, metadata experts… release a platform publishing high-value data

Planning to connect to CKAN and the data catalogue. Attract contributors. Second half of the year this will be more visible.

Will use some AI to remove some of the barriers in technology. All the dots are there, they just need to be connected.

Good platform for storytelling. My initial concern is that it seems the data needs to be in CSV, and he is concerned about the size of the data.

  1. "Give me the entire dataset" is one scenario… or "I just want these 10 variables of a dataset." This is where the API becomes important. The ability to subset data to your needs.
    1. Data goes with metadata. If you give me a CSV file without metadata, I will be lost.
  2. Or doing storytelling: the ability to tell a story around a dataset. Visualization

First, you need to find the data. Know where it exists. Metadata is what is important across the board. Quite a lot of things in the metadata space - harmonising metadata standards. See https://www.researchobject.org/ro-crate/ which is getting very popular; lots of people are using it

https://github.com/ResearchObject/ro-crate STANDARDS

Gut feeling is: I like the idea of using Github. I just don't know if Github is the right platform.

Interesting data is big. Good data is big data. This is a Myth…

Ketan

✨ Quick Onboarding into Datahub Cloud - By the end of the session, you'll have a personal site showcasing your own dataset!
🌟 Share your needs, insights, and wishlist - your input shapes our product!

Part of the Intelligent Automation team at Citco. As part of that, a data governance tool for Citco. While researching for the tool, they went through the studies and came across datahub.io.

As part of the process, this will go through a formal RFP/RFI. Who would be the contact point, where should we direct the formal process, etc?

First touch point. Product overview. Just to find out the contact and who is the contact point.

  • A quick introduction and I lay out how the onboard session will go.
  • I then ask — before we start — if they wouldn’t mind describing what they’re working on and why they’re interested in [THE NAME OF YOUR BIZ HERE].
  • I then send over the link to our beta and ask if they wouldn’t mind sharing their screen (for some of the sessions — especially where there are multiple team members on a call — I’ve demoed the tool myself and followed up with instructions and links after).

Data management capability, data observability, metadata

Multiple systems getting integrated with the data lake. How can we go on with these?

How do we get data observability within the Citco group? EmployID. They know exactly which table and which field to look into.

Tracking it from the source system to the consumption layer, accessibility.

Situation

We have a Quick Start Guide on the Docs page on the website https://datahub.io/docs.

We also have a waitlist of around 90 people who declared interest in getting access to the App.

We have a discord channel for early access where we already shared the Access link to the App with 8 people from the Open Data Day event.

Screenshot 2024-03-10 at 11.44.43.png

Problem

  1. We don't know if the current instructions are enough for users to get their first value from the product (= publish their dataset successfully). Furthermore, the current instructions stop after the user publishes their first dataset.
  2. We have no idea how the onboarding of our early adopters is going and (if it is going) if they managed to publish a dataset.
  3. We also haven't received any feedback about the product itself and whether it's valuable and useful to people.
  4. We don't currently have anyone assigned to proactively check with onboarded users and get their feedback or help them if they need help. We haven't followed up with them since we gave them access.
  5. If we just give people access to the App, we won't have any visibility of their onboarding experience and we won't get any feedback for the product and if it is delivering value.

Appetite

1d of focused work on the onboarding process

Solution

We need a process in place to

  1. guide early adopters to their first value quickly and efficiently (= will be able to publish their first dataset)
  2. understand their experience and journey and how we can improve it
  3. get their feedback on the product and how we can improve it

Bonus

  1. let adopters know we are there for them if they need help or they have questions
  2. bump users back into the straight-line if they fall into the gutter

bowling-alley-framework.webp

Options

  1. With human interaction
    1. Guide them through the whole process (be on the call with them and walk them through) –> overkill for a process which is relatively straightforward
    2. Get on a 15-minute call with each one of them to collect their feedback –> recommended for the open data day people but not scalable
    3. Chat with them during the onboarding process –> too time-consuming and weird
    4. Communicate with them via email during the onboarding process –> checking in with them regularly via email to bring them back to the product seems a good idea
    5. Some chat support –> nice-to-have but not a must
  2. Without human interaction
    1. More instructions: Create a step-by-step onboarding guide, in-app onboarding tips, FAQs, etc. –> yes, we should improve docs indeed but this doesn't give us visibility
    2. Short survey/form to collect user experience and feedback –> yes, we should give users some form to collect their feedback
    3. Recorded short video showing the process step by step –> nice-to-have but not a must
    4. Use a tool like Usetiful https://www.usetiful.com/ or Product Fruits https://productfruits.com/ –> these are mostly useful for inside-the-app instructions and our app is relatively simple atm (doesn't require much guidance)
  3. Mix of both recommended
    1. With the Open Data Day people:
      1. get on a 15-minute call and get their feedback
      2. if they are not open for a call, send them a survey/form to get their feedback
    2. With the future users:
      1. Improve the instructions based on the user feedback collected
      2. Add more instructions for what to do after they publish their first dataset
      3. Include a feedback form/survey as part of the onboarding process
      4. Communicate with the users via email - send them regular emails to guide them

Rabbit Holes

????

No-gos

We don't want to pollute with instructions because we want to keep it simple. We don't want to make it seem complicated, or like it requires a lot of guidance, because we want to keep it easy

I want to walk you through the process. Let us know what your needs are.

First 10 and then email them.

Rapidly test and automate our experience.
