Human Generated Data

Title

Untitled (nine children posed sitting in living room with presents on the floor)

Date

1953

People

Artist: Martin Schweig, American 20th century

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.9466

Machine Generated Data

Tags

Amazon
created on 2022-01-23

Human 99.4
Person 99.4
Person 99
Person 98.7
Person 98.5
Person 98.3
Person 97.3
Person 94.6
Clothing 92.4
Apparel 92.4
Furniture 85
Chair 85
Person 84.4
People 83
Suit 82.6
Coat 82.6
Overcoat 82.6
Female 74.7
Indoors 65.3
Room 64.6
Shorts 63.5
Leisure Activities 63.3
Girl 62.3
Crowd 62.2
Shoe 59
Footwear 59
Dress 58.7
Floor 56.3
Evening Dress 55.5
Fashion 55.5
Gown 55.5
Robe 55.5
Performer 55.4
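Label lists like the one above pair each tag with a confidence score, and such output is typically filtered by a minimum confidence before display. A minimal Python sketch, using a few scores transcribed from the Amazon list above (the 90-point threshold is an illustrative assumption, not part of the record):

```python
# Machine-generated labels with confidence scores, a few transcribed
# from the Amazon Rekognition list above.
labels = [
    ("Human", 99.4), ("Person", 99.4), ("Clothing", 92.4),
    ("Furniture", 85.0), ("Female", 74.7), ("Shoe", 59.0),
]

def filter_labels(labels, min_confidence=90.0):
    """Keep only labels at or above the confidence threshold."""
    return [name for name, conf in labels if conf >= min_confidence]

print(filter_labels(labels))  # ['Human', 'Person', 'Clothing']
```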

Imagga
created on 2022-01-23

people 35.7
man 35.6
businessman 34.4
male 34
person 32.8
business 30.4
group 25.8
men 25.7
room 25.1
office 24.4
adult 24
meeting 21.7
businesswoman 20.9
classroom 19.6
happy 19.4
team 18.8
teamwork 18.5
table 18.2
professional 18
teacher 17.5
together 17.5
corporate 16.3
sitting 16.3
businesspeople 16.1
job 15.9
women 15.8
smiling 15.2
newspaper 15.2
executive 15
desk 14.2
manager 14
talking 13.3
musical instrument 13.1
computer 12.9
success 12.9
worker 12.6
work 12.6
working 12.4
product 12.4
lifestyle 12.3
successful 11.9
chair 11.9
communication 11.7
colleagues 11.7
indoors 11.4
black 11.4
cheerful 11.4
couple 11.3
boy 11.3
portrait 11
casual 10.2
finance 10.1
indoor 10
smile 10
suit 9.9
modern 9.8
creation 9.8
30s 9.6
percussion instrument 9.6
boss 9.6
education 9.5
ethnic 9.5
happiness 9.4
laptop 9.4
building 9.3
two 9.3
mature 9.3
hand 9.1
marimba 9
outdoors 9
coworkers 8.8
student 8.7
diversity 8.6
four 8.6
friends 8.4
horizontal 8.4
company 8.4
holding 8.2
confident 8.2
educator 8.1
day 7.8
conference 7.8
students 7.8
teaching 7.8
40s 7.8
spectator 7.8
cooperation 7.7
mid adult 7.7
class 7.7
hall 7.7
youth 7.7
friendship 7.5
board 7.3
teenager 7.3
handsome 7.1
to 7.1
employee 7

Google
created on 2022-01-23

Microsoft
created on 2022-01-23

person 98.7
text 98.5
clothing 94
man 72.1
group 65.4
musical instrument 57.2
computer 34.1

Face analysis

AWS Rekognition

Age 36-44
Gender Male, 98%
Happy 66.3%
Surprised 27.1%
Sad 4%
Confused 1%
Disgusted 0.6%
Angry 0.5%
Fear 0.3%
Calm 0.2%

AWS Rekognition

Age 33-41
Gender Male, 99.8%
Happy 36.9%
Calm 28.2%
Surprised 10.7%
Disgusted 10.2%
Sad 6.5%
Angry 3%
Confused 2.8%
Fear 1.7%

AWS Rekognition

Age 43-51
Gender Male, 99.6%
Happy 76.8%
Surprised 10%
Calm 7.5%
Sad 3%
Confused 1.1%
Disgusted 0.8%
Angry 0.6%
Fear 0.2%

AWS Rekognition

Age 27-37
Gender Male, 98.9%
Sad 36.5%
Surprised 18.7%
Calm 16.3%
Happy 13.3%
Confused 10%
Disgusted 2.5%
Fear 1.5%
Angry 1.3%

AWS Rekognition

Age 33-41
Gender Male, 96%
Happy 94.2%
Surprised 4.2%
Sad 0.7%
Disgusted 0.2%
Fear 0.2%
Confused 0.2%
Angry 0.1%
Calm 0.1%

AWS Rekognition

Age 28-38
Gender Male, 99.8%
Sad 47.7%
Calm 29.8%
Happy 6.8%
Angry 4.9%
Surprised 4.4%
Disgusted 2.6%
Confused 2%
Fear 1.8%

AWS Rekognition

Age 38-46
Gender Female, 52.4%
Happy 88.3%
Surprised 6.1%
Calm 2.2%
Sad 1.3%
Fear 0.6%
Angry 0.6%
Disgusted 0.5%
Confused 0.5%

AWS Rekognition

Age 31-41
Gender Male, 93.6%
Happy 72%
Sad 9.7%
Calm 5.1%
Surprised 4.7%
Fear 3.6%
Confused 3%
Disgusted 1.2%
Angry 0.7%

AWS Rekognition

Age 47-53
Gender Female, 91.2%
Happy 60.2%
Calm 20.5%
Surprised 16.1%
Sad 1.4%
Disgusted 0.7%
Angry 0.5%
Confused 0.4%
Fear 0.3%
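Each AWS Rekognition block above assigns a probability to every emotion; the face's dominant emotion is simply the highest-scoring entry. A sketch of that selection, with scores copied from the first face block above:

```python
# Emotion scores for one detected face, copied from the first
# AWS Rekognition block above.
face = {
    "Happy": 66.3, "Surprised": 27.1, "Sad": 4.0, "Confused": 1.0,
    "Disgusted": 0.6, "Angry": 0.5, "Fear": 0.3, "Calm": 0.2,
}

def dominant_emotion(scores):
    """Return the (emotion, confidence) pair with the highest score."""
    return max(scores.items(), key=lambda item: item[1])

print(dominant_emotion(face))  # ('Happy', 66.3)
```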

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person 99.4%
Shoe 59%

Captions

Microsoft

a group of people posing for a photo 80.5%
a group of people sitting in front of a window 69.7%
a group of people in front of a window 69.6%

Text analysis

Amazon

VSJ
KODVK-SVEETA
$ 1351

Google

3
SSE
YT37A2-
AGON
3 SSE YT37A2- AGON