When I was a struggling aspiring speaker at the Agile Alliance’s annual flagship conference, I was frustrated and wished they would tell me more about how to make my proposal conference-worthy. Now in my second year as a track team member, I understand better why they didn’t.
The conference
The Agile20nn conference has been happening, as best I can determine, since 2002, and today it’s one of the largest of its kind. Agile2016 in Atlanta will attract about 2,500 attendees and dozens (hundreds? not sure) of speakers and volunteers across 17 tracks. That size makes it easy to offer something for everyone. Managing an event at this scale is non-trivial!
The outside

I’m still waiting for the pony.
I made my first session proposals to Agile2014, and received pleasant form rejection letters at the end of the selection process. I was frustrated, and wished for several things: feedback to know why I hadn’t been chosen; coaching to improve the quality of my future proposals; and a pony.
The inside
As a track reviewer for Agile2015, I initially wanted to be more proactive about providing feedback to submitters, and I was confused and a little frustrated when program chairs and track chairs asked us not to. The good news was that they’d added a new Help queue/coaching option, which allowed interested submitters to request more detailed guidance from dedicated volunteers. But I still didn’t understand why we were so strongly cautioned against providing feedback to submitters who hadn’t specifically asked for it. Why shouldn’t we offer to be helpful?
This year for Agile2016, I’m track co-chair for Leadership. Now I’m one of the big meanies asking my new track team volunteers not to give unsolicited feedback to submitters who don’t use the Help queue/coaching option. Based on last year’s experiences, I think I understand better why we don’t.
The numbers
I served on two tracks for Agile2015: Leadership, with 13 team members evaluating about 100 proposals, and Collaboration, Culture, and Teams, with 16 team members evaluating about 200 proposals. We try to ensure that at least 3 team members provide a detailed evaluation for each session. (Evals are private, shared only within the track team.) That works out to roughly 23 detailed evaluations per reviewer on Leadership and closer to 40 on Collaboration, Culture, and Teams. The majority of proposals come in right at the submission deadline, so it can be a scramble to get everything read and scored and evaluations written in the 4 weeks before our teams’ decision deadline.
Bottom line: sustainability. It’s really tempting to give detailed feedback early in the submission window, when proposals are coming in at a trickle and there’s plenty of time to read and ruminate. Later, as we get busier, we won’t have time to offer unsolicited help.
There’s an element of fairness as well, because if we give detailed unsolicited feedback to some submitters who haven’t asked, we really ought to give similar attention to all.
The quality
The entire submission system is open, so if you create an account, you (yes, you) can read all of the proposals that have been submitted to the Ready queue. I’m going to go out on a limb and suggest you’ll find that not all submitted proposals merit serious consideration. There’s a minimum quality bar that some don’t meet, and there’s evidence every year that some submitters didn’t read the instructions.
Bottom line: discretion. As a reviewer, honestly, if I think a proposal is irredeemably terrible, I don’t feel great about saying so to the submitter, and I’m not sure telling them is a kindness when they haven’t asked.
Also, redundancy. When a submitter hasn’t followed readily available advice, there’s no sense in typing that same advice out again, and certainly not when they haven’t asked.
The expectations
Each of my tracks ultimately approved just under two dozen sessions, out of roughly 100 and 200 proposals respectively (an acceptance rate somewhere between 10 and 25 percent). This means there are a great many proposals, even decent ones, that won’t be selected no matter what the submitter does.
Bottom line: realism. It doesn’t make sense to invest so much of our volunteers’ time giving feedback that wasn’t asked for and won’t change the outcome.
I also worry that unsolicited feedback would create incorrect expectations for the submitter (“if I do this I’ll be selected” or worse, “they must really like me”) and make them even more frustrated later when they still don’t make it.
The people
Track teams are made up of passionate human agilists, which means we have great diversity of opinion and often score the same session quite differently. The track teams’ selections are recommendations to the program team, who have the final say. There is no single person who knows the truth of why a proposal was or was not accepted, and there’s no single piece of feedback that could guarantee acceptance this time or next time or ever.
Bottom line: variability. Feedback from a single reviewer wouldn’t be meaningful; we would need a diversity of opinions from a few reviewers, a track chair, and perhaps a member of the program team too! Now we’ve got conflicting feedback. What’s a submitter supposed to do with that?! And because track teams change every year, this year’s feedback might not accurately represent what they should do next time.
The alternatives
If you care about becoming a speaker at this conference, there’s actually quite a bit of advice already out there, and some solid resources to help you give your proposal its best chance of success.
- Read the program team’s guidance. All of it. Then read it again.
- Read the provided sample successful proposal. Is yours as robust?
- Choose your track thoughtfully. Read your track’s description and goals. In your proposal, clearly spell out how you align with those goals—don’t make the team guess.
- Look through previous years’ successful proposals. Go to the submission system for past years, locate the Tracks page (Agile2015, Agile2014), select an individual track, scroll down to see the list of accepted sessions for that year, and click any session to see its complete proposal: the published abstract plus the submission-system-only program team info.
- If you want help, ask for it! That’s what the Help queue is for. Use it!
I’m thinking about another blog post describing my submitter journey, and some of the other stuff I tried and figured out that helped my proposals. Stay tuned!