Human Generated Data

Title

Untitled (crowded tables at banquet)

Date

1965

People

Artist: Robert Burian, American, active 1940s-1950s

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.19330

Machine Generated Data

Tags

Amazon
created on 2022-03-05

Human 99.8
Person 99.8
Person 99.4
Audience 98.9
Crowd 98.9
Person 98.5
Person 97.5
Person 97.5
Person 96.8
Person 96.7
Person 96.4
Person 95.5
Person 91.8
Person 91.2
Person 88.9
Person 88.6
Person 88.2
Furniture 86.7
Chair 86.7
Classroom 86.6
Room 86.6
School 86.6
Indoors 86.6
Accessory 84.3
Accessories 84.3
Sunglasses 84.3
People 78.9
Interior Design 76.5
Person 76.2
Cafeteria 68.7
Restaurant 68.7
Person 65.4
Person 61.5
Sitting 61.4
Speech 60.1
Lecture 60.1
Person 48.8
Person 43.1
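The Amazon tags above are flat "Label Score" pairs, with one entry per detection (hence the many repeated "Person" rows). A minimal sketch, assuming that line format, of collapsing them into a deduplicated label-to-best-score mapping:

```python
# Sample of the tag lines above; the real list has one "Person" row per detected figure.
raw = """Human 99.8
Person 99.8
Person 99.4
Crowd 98.9
Chair 86.7
Sunglasses 84.3"""

tags: dict[str, float] = {}
for line in raw.splitlines():
    # Split on the last space: the score is the trailing numeric token,
    # so multi-word labels like "Interior Design" stay intact.
    label, score = line.rsplit(" ", 1)
    # Keep the highest confidence seen for each label.
    tags[label] = max(tags.get(label, 0.0), float(score))

# Report labels by confidence, highest first.
for label, score in sorted(tags.items(), key=lambda kv: -kv[1]):
    print(f"{label}: {score}")
```

With the sample input this yields five unique labels, with "Person" kept at its best score of 99.8.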

Imagga
created on 2022-03-05

room 44.2
person 41
teacher 41
classroom 40.1
man 39.6
people 35.1
professional 34.3
meeting 33.9
businessman 33.5
male 33.3
business 32.8
table 31.1
adult 30.8
group 29.8
businesswoman 29.1
office 27
educator 25.9
student 25
team 24.2
businesspeople 22.8
corporate 22.3
executive 22
smiling 21
couple 20.9
happy 20.7
men 20.6
teamwork 19.5
colleagues 18.5
entrepreneur 18.1
work 18.1
senior 17.8
communication 17.6
sitting 17.2
desk 17
conference 16.6
education 16.5
talking 16.2
indoors 15.8
together 15.8
job 15
presentation 14.9
indoor 14.6
manager 14
laptop 13.8
worker 13.4
mature 13
lifestyle 13
success 12.9
discussion 12.7
women 12.6
modern 12.6
suit 12.6
portrait 12.3
20s 11.9
confident 11.8
discussing 11.8
40s 11.7
class 11.6
enrollee 11.5
drinking 11.5
smile 11.4
cheerful 11.4
chair 11.4
home 11.2
successful 11
board 10.9
seminar 10.8
working 10.6
30s 10.6
computer 10.4
adults 10.4
looking 10.4
coworkers 9.8
businessmen 9.7
interior 9.7
collar 9.6
workplace 9.5
friends 9.4
hall 9.3
restaurant 9.2
training 9.2
horizontal 9.2
occupation 9.2
life 9
handsome 8.9
collaboration 8.9
partners 8.7
retired 8.7
cooperation 8.7
corporation 8.7
happiness 8.6
career 8.5
center 8.5
speaker 8.5
showing 8.4
camera 8.3
holding 8.3
blackboard 8.2
associate 7.9
school 7.8
teaching 7.8
busy 7.7
old 7.7
elderly 7.7
hand 7.6
college 7.6
drink 7.5
learning 7.5
study 7.5
wine 7.4
color 7.2
black 7.2
day 7.1

Google
created on 2022-03-05

Microsoft
created on 2022-03-05

person 100
clothing 93.3
human face 91.3
text 87.3
man 85.9
people 85.1
group 77
concert band 11.8
crowd 3.2

Face analysis

AWS Rekognition

Age 35-43
Gender Female, 68.9%
Calm 48.7%
Sad 37.6%
Fear 5.4%
Angry 2.7%
Confused 1.6%
Happy 1.5%
Surprised 1.4%
Disgusted 1.1%

AWS Rekognition

Age 45-53
Gender Male, 99.9%
Sad 87.3%
Happy 8.9%
Confused 1.4%
Calm 0.9%
Surprised 0.5%
Disgusted 0.5%
Angry 0.3%
Fear 0.2%

AWS Rekognition

Age 37-45
Gender Male, 96.9%
Happy 73.3%
Sad 14.2%
Calm 7.9%
Confused 1.8%
Angry 1%
Disgusted 1%
Surprised 0.7%
Fear 0.2%

AWS Rekognition

Age 45-51
Gender Male, 97.6%
Calm 94.3%
Sad 3.1%
Happy 1%
Fear 0.7%
Confused 0.4%
Disgusted 0.2%
Angry 0.1%
Surprised 0.1%

AWS Rekognition

Age 45-53
Gender Female, 79.2%
Happy 37.2%
Calm 32.4%
Sad 8.7%
Surprised 8.7%
Angry 7.3%
Disgusted 3.4%
Confused 1.2%
Fear 1%

AWS Rekognition

Age 48-56
Gender Male, 99.9%
Calm 99.7%
Happy 0.1%
Sad 0.1%
Confused 0%
Disgusted 0%
Surprised 0%
Fear 0%
Angry 0%

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Possible

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Possible
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Likely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely
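Each AWS Rekognition face record above is an emotion distribution whose scores sum to roughly 100%. A minimal sketch, using the first face's scores as sample data, of finding the dominant emotion and checking whether the model was decisive about it:

```python
# Scores copied from the first AWS Rekognition face record above.
face = {"Calm": 48.7, "Sad": 37.6, "Fear": 5.4, "Angry": 2.7,
        "Confused": 1.6, "Happy": 1.5, "Surprised": 1.4, "Disgusted": 1.1}

# Rank emotions by score, highest first.
ranked = sorted(face.items(), key=lambda kv: kv[1], reverse=True)
dominant, runner_up = ranked[0], ranked[1]

# Call the read "decisive" only if the top emotion clears the runner-up
# by 20 points (an arbitrary cutoff chosen here for illustration).
decisive = dominant[1] - runner_up[1] >= 20.0

print(dominant[0], decisive)
```

For this face the top emotion is Calm at 48.7%, but with Sad at 37.6% the margin is only 11.1 points, so the read is not decisive.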

Feature analysis

Amazon

Person 99.8%
Chair 86.7%
Sunglasses 84.3%

Captions

Microsoft

a group of people sitting in front of a crowd 93.6%
a group of people sitting and standing in front of a crowd 93.5%
a group of people sitting in chairs in front of a crowd 93.4%

Text analysis

Amazon

5
9
a
MJ13
MADO
YT37A2
MJ13 YY37A2
MAGOM
MJIA YT37A2
MJIA
YY37A2

Google

D.
Y
A°2
ACO
D. MJIR Y T 33 A2 XACOX YT33 A°2 ACO
MJIR
A2
XACOX
33
T
YT33