Human Generated Data

Title

Untitled (two women in bowler hats posed sitting together at New Year's party)

Date

1952

People

Artist: Martin Schweig, American 20th century

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.9425

Machine Generated Data

Tags

Amazon
created on 2022-01-23

Apparel 99.4
Clothing 99.4
Human 95.8
Person 95.8
Person 95.8
Furniture 94.5
Person 94.4
Sitting 92.8
Floor 87
Flooring 82.5
Chair 76.4
Table 73.3
Female 73
Photography 67.1
Photo 67.1
Portrait 65.5
Face 65.5
Suit 62.8
Coat 62.8
Overcoat 62.8
Room 58.1
Indoors 58.1
Woman 57.6
Girl 55.9

Imagga
created on 2022-01-23

chair 51.3
rocking chair 31.8
seat 30.2
interior 27.4
people 25.1
man 24.9
table 24.8
furniture 23.2
business 22.5
office 22
room 20.7
male 19.9
patio 19.7
person 18.9
window 18.5
balcony 18.3
modern 18.2
work 16.5
glass 16.3
sitting 16.3
lifestyle 15.9
structure 15.8
adult 15.5
indoor 15.5
indoors 14.9
floor 14.9
architecture 14.8
women 14.2
businessman 14.1
urban 13.1
group 12.9
musical instrument 12.3
worker 12.2
hall 12.2
home 12
area 11.6
percussion instrument 11.5
building 11.5
meeting 11.3
happy 11.3
men 11.2
inside 11
smiling 10.9
team 10.7
corporate 10.3
teacher 10.1
communication 10.1
house 10
city 10
silhouette 9.9
airport 9.8
classroom 9.8
door 9.7
restaurant 9.3
relaxation 9.2
board 9.1
black 9
sliding door 8.9
furnishing 8.9
steel 8.8
working 8.8
full length 8.7
wall 8.7
empty 8.6
study 8.4
wood 8.3
holding 8.3
desk 8.2
cheerful 8.1
professional 8
job 8
20 24 years 7.9
corridor 7.9
day 7.8
leisure activity 7.8
smile 7.8
chairs 7.8
education 7.8
teamwork 7.4
computer 7.3
businesswoman 7.3
happiness 7.1

Google
created on 2022-01-23

Microsoft
created on 2022-01-23

furniture 97.1
table 96.3
black and white 91.8
chair 88.9
text 67.3
house 59.6
person 52.4

Face analysis

AWS Rekognition

Age 45-53
Gender Male, 99.7%
Sad 54.8%
Happy 41.4%
Confused 1.2%
Surprised 0.8%
Disgusted 0.6%
Calm 0.5%
Fear 0.4%
Angry 0.3%

AWS Rekognition

Age 36-44
Gender Male, 99.8%
Sad 32%
Disgusted 17.2%
Happy 15.7%
Confused 15.6%
Calm 11.8%
Surprised 4%
Angry 2.2%
Fear 1.6%

AWS Rekognition

Age 48-56
Gender Male, 65.1%
Sad 60.8%
Calm 16.1%
Confused 12.1%
Fear 3.9%
Happy 2.8%
Disgusted 2.4%
Angry 1.3%
Surprised 0.5%

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Likely
Headwear Possible
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person 95.8%
Chair 76.4%

Captions

Microsoft

a person sitting on a bench in front of a window 66.3%
a person sitting on a bench in front of a window 66.2%
a person sitting in front of a window 66.1%

Text analysis

Amazon

80A
KODVK-EELA
<<<<
0 <<<<
0