It all depends on your requirements, honestly. A single workflow might seem easy, but it comes with some challenges.
Migrating a single workflow with hundreds of sessions takes time because of the conflict resolution. You can make this easier by using reusable sessions, but you have to enforce that practice.
You can use worklets, which let you group sessions together and keep the master workflow easier to read.
You also need to think about restartability. If a session fails, do you want to stop the workflow at that point, fix the problem, and reprocess? Or do you want to continue, in which case you need to be able to restart just the failed task(s), or restart from the point of failure, with the mappings designed so that they do not double-process records.
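The "do not double-process records" idea can be sketched outside PowerCenter. The following is a minimal, hypothetical illustration (the control-table name and batch IDs are invented): a load step records each completed batch in a control table and skips batches it has already loaded, so restarting a failed workflow from the top cannot insert duplicates.

```python
import sqlite3

# Hypothetical sketch of a restart-safe load step: a control table
# remembers which batches were loaded, so a rerun skips them.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE load_control (batch_id TEXT PRIMARY KEY)")
conn.execute("CREATE TABLE target (batch_id TEXT, value INTEGER)")

def load_batch(batch_id, rows):
    """Load a batch only if it has not been loaded before."""
    already = conn.execute(
        "SELECT 1 FROM load_control WHERE batch_id = ?", (batch_id,)
    ).fetchone()
    if already:
        return "skipped"  # restart case: batch is done, do nothing
    conn.executemany(
        "INSERT INTO target VALUES (?, ?)",
        [(batch_id, v) for v in rows],
    )
    conn.execute("INSERT INTO load_control VALUES (?)", (batch_id,))
    conn.commit()
    return "loaded"

print(load_batch("2024-01-15", [1, 2, 3]))  # loaded
print(load_batch("2024-01-15", [1, 2, 3]))  # skipped
```

In a real mapping the same effect is usually achieved with a lookup against a control table or a primary-key/upsert strategy on the target; the sketch only shows the pattern.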
For staging data into a system, on the other hand, a single workflow works well: everything is truncate-and-reload, so you can run 20 sessions at a time and move through the workflow quickly.
You can see there are lots of approaches. We do not know enough about what type of sessions/design you are going for.
Adding to Scott's notes, from my own experience I know that many customers tend to have one session per workflow as a general recipe, while other customers tend to create long and complex workflows.
Personally I sometimes go one road and sometimes the other. For example, my last project was a testing framework which my customer can use to check, for every single system upgrade (be it a DB server, PowerCenter itself, some mainframe database, or whatever), whether data transfer and storage/retrieval still work as expected after the upgrade. The system consists of 35 workflows; several of these contain only one session, several contain 2-5 sessions, the central "scheduler" workflow consists of 10 sessions plus 5 other tasks, and there are approximately 20 workflows which consist of anything from 10 to 50 sessions and other tasks.
The central "scheduler" is the most complex one, but even this one is clearly structured. It should not become more complex than it is now, but the big advantage is that my customer only has to start this one "scheduler" workflow once the actual requirements have been keyed into a DB table; after that, everything else runs automatically (orchestrated by this workflow and a couple of controlling tables).
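A control-table-driven scheduler of this kind can be sketched in a few lines. This is only an illustration under assumed names (the `wf_requests` table, workflow names, and `start_workflow` helper are all invented); in a real PowerCenter setup the start mechanism would typically be something like a `pmcmd startworkflow` call or a Command task.

```python
import sqlite3

# Hypothetical control table: each row is a request to run one workflow.
conn = sqlite3.connect(":memory:")
conn.execute("""CREATE TABLE wf_requests (
    workflow TEXT, status TEXT DEFAULT 'PENDING')""")
conn.executemany("INSERT INTO wf_requests (workflow) VALUES (?)",
                 [("wf_load_customers",), ("wf_load_orders",)])
conn.commit()

def start_workflow(name):
    # Placeholder for the real start mechanism (e.g. pmcmd).
    return True

def run_scheduler():
    """Pick up pending requests, start them, and mark them done."""
    started = []
    for (wf,) in conn.execute(
            "SELECT workflow FROM wf_requests WHERE status = 'PENDING'"):
        if start_workflow(wf):
            conn.execute(
                "UPDATE wf_requests SET status = 'DONE' WHERE workflow = ?",
                (wf,))
            started.append(wf)
    conn.commit()
    return started

print(run_scheduler())  # starts both pending workflows
```

The point is the design, not the code: the customer keys requirements into a table, and one orchestrating workflow reads that table and drives everything else.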
So the infrastructure is a bit complex. But except for this "scheduler" workflow all other workflows are set up quite simply: each branch in each of these workflows follows exactly the same pattern as all other branches.
This means: after having read the initial explanation for this framework, working through these workflows is fairly straightforward. The workflows are so huge because they have to cater for up to six different use cases per execution, and each use case may need up to 7 single tasks.
Nonetheless almost all these workflows have a plain simple structure. Defining the framework and all the individual functionalities was the real challenge.
Personally I prefer to set up workflows as "functional entities". Each workflow should perform one single logical "action". Keep each "functional entity" as simple and clear to understand as possible (and as complex as necessary).
Orchestrating these workflows, that's a task for some scheduling system (although a workflow can do that as well, as in my case above).
It may make sense to have one session per workflow in, say, 50% of all cases and 2-5 sessions in each of the remaining workflows.
It may also make sense to put all 100 sessions into one workflow. But then, as mentioned by Scott, restartability is an issue you must not neglect, and deployments take some time.
It really depends on the actual requirements in each case. There is no other general rule which I would recommend to follow.
Regards and sorry for the lengthy post,