Human Generated Data

Title

Untitled (people at party outside, man with accordion)

Date

c. 1950

People

Artist: John Howell, American, active 1930s-1960s

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.21640

Machine Generated Data

Tags

Amazon
created on 2022-03-05

Person 99.4
Human 99.4
Person 99.2
Person 98.7
Person 98.5
Person 96.5
Person 95.9
Furniture 92.5
Musical Instrument 90.7
Chair 87
Musician 86
Tie 85.9
Accessory 85.9
Accessories 85.9
Suit 83.5
Overcoat 83.5
Clothing 83.5
Coat 83.5
Apparel 83.5
Accordion 82.6
Person 77.6
Photo 66.1
Photography 66.1
Music Band 66
Face 64.6
Portrait 64.6

Imagga
created on 2022-03-05

musical instrument 44.2
accordion 35.8
wind instrument 31.8
people 31.8
man 31.6
group 30.6
business 30.4
keyboard instrument 28.3
male 27.6
person 23.8
businessman 23.8
men 23.2
office 21.7
room 21.5
laptop 21.4
table 20.8
meeting 20.7
corporate 20.6
team 20.6
adult 19.5
computer 19.4
chair 19.3
work 18.8
women 16.6
communication 15.1
interior 15
hall 14.9
teamwork 14.8
modern 14.7
businesswoman 14.5
together 14
restaurant 13.9
classroom 13.8
businesspeople 13.3
executive 13.3
handsome 12.5
professional 12.4
teacher 12.4
indoors 12.3
sitting 12
happy 11.9
suit 11.7
education 11.3
worker 11.1
working 10.6
couple 10.4
desk 10.4
employee 10.3
two 10.2
inside 10.1
lifestyle 10.1
board 9.9
silhouette 9.9
device 9.9
conference 9.8
job 9.7
photographer 9.7
sax 9.7
success 9.7
student 9.6
glass 9.3
casual 9.3
smile 9.3
building 9.2
oboe 9.1
musician 9
black 9
technology 8.9
style 8.9
students 8.8
discussion 8.8
urban 8.7
boy 8.7
reading 8.6
friends 8.4
attractive 8.4
study 8.4
manager 8.4
window 8.2
indoor 8.2
teenager 8.2
music 8.1
happiness 7.8
diverse 7.8
portrait 7.8
notebook 7.8
class 7.7
diversity 7.7
youth 7.7
workplace 7.6
learning 7.5
bass 7.5
leisure 7.5
floor 7.4
design 7.3
confident 7.3
outfit 7.3
looking 7.2
day 7.1
cafeteria 7

Google
created on 2022-03-05

Microsoft
created on 2022-03-05

person 97.7
chair 95.2
black and white 93.6
furniture 92.7
text 84.9
clothing 81.3
people 79
table 71.7
man 61.7
crowd 0.8

Face analysis

Amazon

Google

AWS Rekognition

Age 48-56
Gender Male, 99.7%
Happy 92.5%
Surprised 2.4%
Calm 1.7%
Confused 1.5%
Sad 1.1%
Disgusted 0.4%
Fear 0.2%
Angry 0.2%

AWS Rekognition

Age 33-41
Gender Male, 88.4%
Calm 99.7%
Fear 0.1%
Surprised 0.1%
Happy 0.1%
Disgusted 0%
Sad 0%
Confused 0%
Angry 0%

AWS Rekognition

Age 25-35
Gender Male, 98.9%
Sad 40.4%
Calm 19.5%
Happy 10.4%
Fear 8.1%
Confused 6.9%
Surprised 6.5%
Angry 4.5%
Disgusted 3.6%

AWS Rekognition

Age 41-49
Gender Male, 98.6%
Calm 75.1%
Happy 14.2%
Surprised 5.3%
Sad 1.6%
Confused 1.4%
Fear 1.3%
Disgusted 0.6%
Angry 0.5%

AWS Rekognition

Age 39-47
Gender Male, 98.9%
Calm 82.2%
Sad 6.2%
Happy 4.5%
Surprised 3.9%
Confused 1.7%
Disgusted 0.7%
Fear 0.4%
Angry 0.4%

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person 99.4%
Chair 87%
Tie 85.9%
Suit 83.5%

Captions

Microsoft

a group of people standing around a bus 87.1%
a group of people standing next to a train 87%
a group of people standing in a room 86.9%

Text analysis

Amazon

PEE
KODAK2VLELA
PO