Human Generated Data

Title

Untitled (couple and child)

Date

1910s

People

Artist: Ross Studio, American 20th century

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, 2.2002.2175

Machine Generated Data

Tags

Amazon
created on 2022-01-23

Person 99.7
Human 99.7
Person 99.1
Person 98.7
Clothing 92.9
Apparel 92.9
Face 91.6
Art 85.2
Head 74
Costume 73.1
Photo 70.5
Photography 70.5
Portrait 70.5
People 65.7
Painting 63.4
Text 62.4
Sleeve 62.3
Drawing 58

Imagga
created on 2022-01-23

money 41.6
currency 40.3
cash 36.6
dollar 34.3
finance 33.8
financial 27.6
banking 27.6
business 27.3
bank 26.9
wealth 25.1
wall clock 24.9
bill 23.8
paper 22.8
dollars 21.2
clock 20.2
savings 19.6
pick 18.9
banknote 18.4
exchange 18.1
device 17.5
old 17.4
investment 16.5
hundred 16.4
one 16.4
rich 15.8
vintage 15.7
pay 15.3
timepiece 15
economy 14.8
us 14.4
retro 13.9
close 13.7
success 12.9
note 12.9
banknotes 12.7
finances 12.5
art 12.1
bills 11.7
symbol 11.4
funds 10.8
market 10.6
states 10.6
loan 10.5
profit 10.5
measuring instrument 10.4
compact disk 10.3
concepts 9.8
design 9.8
payment 9.6
black 9.6
commerce 9.3
container 9.3
phonograph record 9.1
texture 9
antique 8.6
notes 8.6
grunge 8.5
sign 8.3
world 8
franklin 7.9
president 7.8
wages 7.8
economics 7.8
value 7.8
tray 7.7
england 7.6
pattern 7.5
closeup 7.4
gold 7.4
object 7.3
china 7.3
color 7.2
face 7.1

Google
created on 2022-01-23

Coat 90.4
Sleeve 87.2
Gesture 85.3
Collar 80.2
Suit 80.1
Art 75.7
Rectangle 73
Vintage clothing 71.8
Formal wear 66.4
History 64.1
Event 63.1
Font 60.8
Room 59.7
Visual arts 57.3
Moustache 55.5
Portrait 55.2
Retro style 54.1
Paper product 52.5
Facial hair 52.1
Beard 51.1

Microsoft
created on 2022-01-23

human face 98.1
text 97.3
person 93.9
drawing 86.7
clothing 84.3
man 76.7
woman 67
posing 51.9

Face analysis

AWS Rekognition

Age 48-54
Gender Female, 58.5%
Confused 33.6%
Calm 24.8%
Disgusted 15.4%
Angry 10.2%
Happy 7%
Sad 3.4%
Surprised 2.9%
Fear 2.6%

AWS Rekognition

Age 43-51
Gender Male, 99.9%
Calm 94.4%
Surprised 1.5%
Fear 1.4%
Confused 1.3%
Angry 0.5%
Sad 0.4%
Disgusted 0.3%
Happy 0.2%

AWS Rekognition

Age 7-17
Gender Female, 100%
Calm 89.8%
Angry 5.7%
Confused 2.4%
Sad 0.6%
Surprised 0.5%
Happy 0.4%
Disgusted 0.3%
Fear 0.2%

Microsoft Cognitive Services

Age 9
Gender Female

Microsoft Cognitive Services

Age 57
Gender Female

Microsoft Cognitive Services

Age 53
Gender Male

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Possible
Headwear Possible
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person 99.7%

Captions

Microsoft

a close up of a box posing for the camera 60.6%
a close up of a box posing for a photo 56.1%
a person posing for a photo in front of a box 42.1%