Human Generated Data

Title

Untitled (elderly women and couple sitting in restaurant booths)

Date

1961

People

Artist: Martin Schweig, American 20th century

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.9856

Machine Generated Data

Tags

Amazon
created on 2022-01-28

Human 99.7
Person 99.7
Person 94.3
Room 92.8
Indoors 92.8
Person 91.7
Person 86.8
Art 73.9
Interior Design 72.5
Clinic 72
Furniture 70.5
Drawing 57.3
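
The Amazon tags above are in the name-plus-confidence form produced by the Rekognition label-detection API. A minimal sketch of such a call follows, assuming the boto3 SDK, configured AWS credentials, and a hypothetical local filename derived from the object number; the exact pipeline used to build this page is not documented here.

import boto3

client = boto3.client("rekognition")

# Hypothetical filename; any local JPEG/PNG bytes work.
with open("4.2002.9856.jpg", "rb") as f:
    image_bytes = f.read()

resp = client.detect_labels(Image={"Bytes": image_bytes}, MinConfidence=50)
for label in resp["Labels"]:
    # Labels such as "Person" also carry per-instance bounding boxes in
    # label["Instances"], which is what the Feature analysis section reflects.
    print(f'{label["Name"]} {label["Confidence"]:.1f}')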

Imagga
created on 2022-01-28

business 30.4
people 29.6
person 27.6
businessman 26.5
man 26.2
male 24.2
men 24
group 23.4
team 22.4
silhouette 21.5
adult 20.6
negative 18.5
film 18.1
professional 18
corporate 17.2
happy 16.3
office 16.1
work 14.9
worker 14.2
job 14.2
working 14.1
businesswoman 13.6
teamwork 13
human 12.7
women 12.7
portrait 12.3
black 12
photographic paper 11.7
family 11.6
businesspeople 11.4
career 11.4
meeting 11.3
executive 11.3
home 11.2
suit 11
modern 10.5
life 10.5
couple 10.5
manager 10.2
communication 10.1
house 10
together 9.6
employee 9.6
successful 9.1
new 8.9
success 8.8
assistant 8.7
colleagues 8.7
world 8.7
cooperation 8.7
wind instrument 8.4
company 8.4
sport 8.2
confident 8.2
clothing 8.1
room 7.9
smile 7.8
photographic equipment 7.8
teacher 7.7
boss 7.6
brass 7.6
fashion 7.5
holding 7.4
light 7.4
design 7.3
smiling 7.2
musical instrument 7.2
cornet 7.2
looking 7.2
reflection 7.1
happiness 7
indoors 7
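
The Imagga tags follow the same pattern. A sketch of the corresponding call against Imagga's public tagging endpoint follows, assuming the requests library and placeholder API credentials (Imagga uses HTTP Basic auth with a key/secret pair); the credentials and filename are assumptions, not values from this page.

import requests

# Placeholder credentials; not the ones used to generate this page.
auth = ("YOUR_API_KEY", "YOUR_API_SECRET")

with open("4.2002.9856.jpg", "rb") as f:  # hypothetical local filename
    resp = requests.post("https://api.imagga.com/v2/tags",
                         auth=auth, files={"image": f})

for tag in resp.json()["result"]["tags"]:
    print(f'{tag["tag"]["en"]} {tag["confidence"]:.1f}')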

Google
created on 2022-01-28

Microsoft
created on 2022-01-28

drawing 97.5
wall 97.1
indoor 93.4
sketch 91.1
text 88.3
black and white 84.3
clothing 77
person 74.1
white 72.4
cartoon 70.9
room 47
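
The Microsoft tags above, and the captions listed under "Captions" further down, match the output shape of the Azure Computer Vision analyze endpoint. A sketch follows, assuming the requests library and a placeholder Azure resource endpoint and key.

import requests

# Placeholder Azure Computer Vision resource; both values are assumptions.
endpoint = "https://YOUR_RESOURCE.cognitiveservices.azure.com"
key = "YOUR_SUBSCRIPTION_KEY"

with open("4.2002.9856.jpg", "rb") as f:  # hypothetical local filename
    resp = requests.post(
        endpoint + "/vision/v3.2/analyze",
        params={"visualFeatures": "Tags,Description"},
        headers={"Ocp-Apim-Subscription-Key": key,
                 "Content-Type": "application/octet-stream"},
        data=f.read(),
    )

analysis = resp.json()
# Confidences come back in 0-1 and are shown on this page as percentages.
for tag in analysis["tags"]:
    print(f'{tag["name"]} {tag["confidence"] * 100:.1f}')
for cap in analysis["description"]["captions"]:
    print(f'{cap["text"]} {cap["confidence"] * 100:.1f}%')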

Face analysis

AWS Rekognition

Age 48-54
Gender Male, 98.9%
Calm 48.7%
Sad 15%
Fear 9.3%
Happy 8.4%
Disgusted 8.1%
Surprised 4.2%
Confused 3.5%
Angry 2.6%

AWS Rekognition

Age 45-53
Gender Male, 99.5%
Calm 99.9%
Sad 0%
Disgusted 0%
Fear 0%
Happy 0%
Surprised 0%
Confused 0%
Angry 0%

AWS Rekognition

Age 47-53
Gender Male, 58.2%
Calm 62.2%
Fear 13.2%
Sad 8.6%
Confused 8.2%
Surprised 2.3%
Happy 2.3%
Disgusted 1.6%
Angry 1.5%
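
Each AWS Rekognition block above (an age range, a gender estimate, and an emotion distribution summing to roughly 100%) corresponds to one detected face from the Rekognition face-detection API. A sketch follows, assuming boto3 and the same hypothetical filename as earlier.

import boto3

client = boto3.client("rekognition")

with open("4.2002.9856.jpg", "rb") as f:
    resp = client.detect_faces(Image={"Bytes": f.read()}, Attributes=["ALL"])

for face in resp["FaceDetails"]:
    age, gender = face["AgeRange"], face["Gender"]
    print(f'Age {age["Low"]}-{age["High"]}')
    print(f'Gender {gender["Value"]}, {gender["Confidence"]:.1f}%')
    # Emotions come back as a list of {Type, Confidence}; sort highest-first
    # to match the ordering used on this page.
    for emotion in sorted(face["Emotions"], key=lambda e: -e["Confidence"]):
        print(f'{emotion["Type"].capitalize()} {emotion["Confidence"]:.1f}%')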

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely
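
The Google Vision blocks report per-face likelihood buckets (Very unlikely through Very likely) rather than percentages. A sketch using the google-cloud-vision client library follows, assuming GOOGLE_APPLICATION_CREDENTIALS is configured and the same hypothetical filename.

from google.cloud import vision

client = vision.ImageAnnotatorClient()

with open("4.2002.9856.jpg", "rb") as f:  # hypothetical local filename
    image = vision.Image(content=f.read())

resp = client.face_detection(image=image)

for face in resp.face_annotations:
    # Each attribute is a Likelihood enum; .name yields e.g. "VERY_UNLIKELY".
    for label, value in [("Surprise", face.surprise_likelihood),
                         ("Anger", face.anger_likelihood),
                         ("Sorrow", face.sorrow_likelihood),
                         ("Joy", face.joy_likelihood),
                         ("Headwear", face.headwear_likelihood),
                         ("Blurred", face.blurred_likelihood)]:
        print(label, vision.Likelihood(value).name)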

Feature analysis

Amazon

Person 99.7%

Captions

Microsoft

a group of men standing in front of a mirror posing for the camera 47.9%
a group of men standing in front of a mirror 47.8%
a person standing in front of a mirror posing for the camera 47.7%

Text analysis

Amazon

Forks
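
The single detected string ("Forks") matches the output of the Rekognition text-detection API, which returns both LINE and WORD detections. A sketch follows, assuming boto3 and the same hypothetical filename.

import boto3

client = boto3.client("rekognition")

with open("4.2002.9856.jpg", "rb") as f:
    resp = client.detect_text(Image={"Bytes": f.read()})

# Print each detected line once; WORD entries repeat the same content.
for det in resp["TextDetections"]:
    if det["Type"] == "LINE":
        print(det["DetectedText"], f'{det["Confidence"]:.1f}')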