Human Generated Data

Title

Untitled (people at debutante ball)

Date

c. 1966

People

Artist: Robert Burian, American, active 1940s-1950s

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.19258

Machine Generated Data

Tags

Amazon
created on 2022-02-25

Person 98.2
Human 98.2
Person 97.1
Clothing 97
Apparel 97
Person 97
Person 96.6
Person 94.3
Person 90.9
Person 90.8
Suit 89.4
Overcoat 89.4
Coat 89.4
Person 89
Person 79.7
Crowd 79.7
Robe 72.8
Fashion 72.8
Gown 70.9
Person 69.1
Tuxedo 69
Face 64.9
Text 64
Wedding 63.7
Audience 61.7
Female 60.8
Dating 60.2
Party 58.9
Wedding Gown 56.2
Evening Dress 55.4
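
The Amazon tags above are label detections from AWS Rekognition. As an illustrative sketch only (not part of the museum record), comparable Name/Confidence pairs can be requested with boto3; the file name photo.jpg, the region, and the 55-point confidence floor are assumptions chosen to mirror the list above.

    # Illustrative sketch, not part of the museum record.
    import boto3

    rekognition = boto3.client("rekognition", region_name="us-east-1")  # assumed region

    with open("photo.jpg", "rb") as f:  # assumed local copy of the image
        image_bytes = f.read()

    # detect_labels returns labels, each with a Name and a Confidence score.
    response = rekognition.detect_labels(
        Image={"Bytes": image_bytes},
        MinConfidence=55,  # roughly the lowest score shown in the list above
    )

    for label in response["Labels"]:
        print(f'{label["Name"]} {label["Confidence"]:.1f}')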

Clarifai
created on 2023-10-22

people 99.8
group 98.6
man 97.3
adult 96.5
woman 96.4
movie 93.5
portrait 93.4
family 90
wedding 89.9
television 89.4
actor 88.9
wear 88.1
administration 88.1
music 86.8
dress 86.7
actress 83.5
dinner jacket 81.9
leader 81.7
group together 81.4
many 79.8

Imagga
created on 2022-02-25

man 32.3
teacher 29.5
person 28.7
people 27.9
professional 26.5
male 26.3
business 26.1
businessman 24.7
adult 21.6
office 21.2
educator 19.7
laptop 19.5
sitting 18.9
work 18
desk 17.9
smile 17.8
group 17.7
happy 17.5
education 17.3
computer 17.2
suit 16.3
team 16.1
smiling 15.2
communication 14.3
student 14.1
job 13.3
black 12.9
corporate 12.9
businesswoman 11.8
fun 11.2
women 11.1
silhouette 10.8
worker 10.7
studio 10.6
classroom 10.6
attractive 10.5
couple 10.4
tie 10.4
men 10.3
board 10.2
youth 10.2
sport 10.1
school 9.9
success 9.6
table 9.5
planner 9.5
notebook 9.5
learn 9.4
executive 9.4
meeting 9.4
expression 9.4
child 9.3
two 9.3
jacket 9.2
world 9.2
indoor 9.1
cheerful 8.9
working 8.8
boy 8.7
love 8.7
play 8.6
college 8.5
pretty 8.4
study 8.4
color 8.3
glasses 8.3
active 8.3
technology 8.2
knowledge 7.7
exam 7.7
employee 7.6
casual 7.6
workplace 7.6
finance 7.6
friends 7.5
manager 7.4
event 7.4
shirt 7.4
looking 7.2
home 7.2
paper 7.1
happiness 7

Google
created on 2022-02-25

Microsoft
created on 2022-02-25

wall 97.9
text 97.3
clothing 94.7
man 91.9
person 90.4
woman 89
human face 82.9
smile 81.3
black 80.8
posing 79.9
white 66.4
poster 52.9
picture frame 10.7

Color Analysis

Face analysis

AWS Rekognition

Age 25-35
Gender Female, 99.9%
Happy 96%
Surprised 1.2%
Fear 0.7%
Angry 0.6%
Calm 0.5%
Disgusted 0.4%
Sad 0.3%
Confused 0.2%

AWS Rekognition

Age 24-34
Gender Female, 94.4%
Happy 97.5%
Calm 1.5%
Sad 0.3%
Surprised 0.2%
Fear 0.1%
Angry 0.1%
Disgusted 0.1%
Confused 0.1%

AWS Rekognition

Age 27-37
Gender Male, 99.4%
Calm 72.1%
Happy 21.3%
Sad 2%
Angry 1.6%
Confused 1%
Surprised 0.7%
Fear 0.7%
Disgusted 0.6%

AWS Rekognition

Age 18-24
Gender Male, 86.3%
Calm 72.8%
Sad 9.4%
Confused 8.8%
Surprised 3.3%
Angry 3.2%
Fear 1.1%
Disgusted 0.9%
Happy 0.5%

AWS Rekognition

Age 53-61
Gender Male, 99.3%
Calm 93.2%
Sad 2%
Angry 1.7%
Confused 1.3%
Disgusted 0.6%
Surprised 0.5%
Happy 0.4%
Fear 0.4%

AWS Rekognition

Age 25-35
Gender Male, 52.8%
Sad 64.4%
Calm 11%
Fear 9.4%
Angry 4.2%
Happy 4%
Disgusted 2.7%
Confused 2.6%
Surprised 1.7%

AWS Rekognition

Age 21-29
Gender Female, 92.3%
Confused 42.2%
Sad 27.4%
Calm 21.7%
Fear 2.6%
Surprised 2.3%
Angry 1.5%
Disgusted 1.2%
Happy 1.1%
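
The age, gender, and emotion rows above are per-face attribute estimates from AWS Rekognition. As a hedged sketch only (the file name, region, and output formatting are assumptions), similar estimates can be requested with detect_faces and Attributes=["ALL"]:

    # Illustrative sketch, not part of the museum record.
    import boto3

    rekognition = boto3.client("rekognition", region_name="us-east-1")  # assumed region

    with open("photo.jpg", "rb") as f:  # assumed local copy of the image
        image_bytes = f.read()

    response = rekognition.detect_faces(
        Image={"Bytes": image_bytes},
        Attributes=["ALL"],  # include age range, gender, and emotion estimates
    )

    for face in response["FaceDetails"]:
        age = face["AgeRange"]
        gender = face["Gender"]
        top_emotion = max(face["Emotions"], key=lambda e: e["Confidence"])
        print(f'Age {age["Low"]}-{age["High"]}')
        print(f'Gender {gender["Value"]}, {gender["Confidence"]:.1f}%')
        print(f'{top_emotion["Type"].title()} {top_emotion["Confidence"]:.1f}%')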

Microsoft Cognitive Services

Age 28
Gender Female

Microsoft Cognitive Services

Age 29
Gender Male

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Likely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Likely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person
Person 98.2%
Person 97.1%
Person 97%
Person 96.6%
Person 94.3%
Person 90.9%
Person 90.8%
Person 89%
Person 79.7%
Person 69.1%

Text analysis

Amazon

33
52

Google

452 33
452
33
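
The Amazon text detections above ("33", "52") are the kind of output AWS Rekognition's text detection returns. As a hedged sketch only (file name and region are assumptions), line-level detections can be listed like this:

    # Illustrative sketch, not part of the museum record.
    import boto3

    rekognition = boto3.client("rekognition", region_name="us-east-1")  # assumed region

    with open("photo.jpg", "rb") as f:  # assumed local copy of the image
        image_bytes = f.read()

    response = rekognition.detect_text(Image={"Bytes": image_bytes})

    for detection in response["TextDetections"]:
        if detection["Type"] == "LINE":  # skip the per-word duplicates
            print(detection["DetectedText"])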