Human Generated Data

Title

Untitled (three young women)

Date

c. 1920

People

Artist: Durette Studio, American 20th century

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.780

Machine Generated Data

Tags

Amazon
created on 2021-12-14

Human 99.7
Person 99.7
Person 99.4
Person 98.6
Head 94.9
Face 89.4
Art 86
Text 72.1
Painting 71
Advertisement 68.7
Portrait 67.9
Photography 67.9
Photo 67.9
Poster 66.4
Female 60.8
Clothing 59
Apparel 59
Sculpture 58.6
Collage 56.5
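The machine-generated tags above are "label confidence" pairs, one per line. A minimal sketch of how such a list could be parsed into structured data (the `parse_tags` helper and the sample lines are illustrative, not part of any museum API):

```python
def parse_tags(text):
    """Parse lines of 'label confidence' into (label, score) tuples.

    Splits on the last space so multi-word labels such as
    'liquid crystal display' stay intact.
    """
    tags = []
    for line in text.strip().splitlines():
        name, _, score = line.rpartition(" ")
        tags.append((name, float(score)))
    return tags

# Sample transcribed from the Amazon tag list above
sample = """Human 99.7
Person 99.7
Art 86
Text 72.1"""

print(parse_tags(sample))
```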

Imagga
created on 2021-12-14

television 93.8
telecommunication system 55.5
monitor 23.2
money 20.4
screen 19.5
business 18.8
cash 18.3
currency 17.9
broadcasting 17.1
one 16.4
financial 16
dollar 15.8
black 14.4
display 14
banking 13.8
office 13.6
portrait 13.6
liquid crystal display 13.5
savings 13
symbol 12.8
technology 12.6
paper 12.5
telecommunication 12.3
finance 11.8
wealth 11.7
face 11.4
art 11.3
people 11.1
culture 11.1
man 10.7
web site 10.6
laptop 10.3
close 10.3
background 10
computer 10
vintage 9.9
bank 9.8
banknotes 9.8
dollars 9.7
us 9.6
finances 9.6
design 9.6
post 9.5
person 9.5
electronic equipment 9.4
happy 9.4
rich 9.3
investment 9.2
global 9.1
equipment 9.1
child 9
masterpiece 8.9
known 8.9
paintings 8.8
smiling 8.7
flat 8.7
pay 8.6
loan 8.6
communications 8.6
painted 8.6
fine 8.6
bill 8.6
smile 8.5
unique 8.5
male 8.5
object 8.1
hair 7.9
post mail 7.9
zigzag 7.9
fame 7.9
postmark 7.9
shows 7.9
printed 7.9
postage 7.9
renaissance 7.9
postal 7.8
painter 7.8
envelope 7.8
delivery 7.8
bills 7.8
banknote 7.8
museum 7.8
hundred 7.7
stamp 7.7
cutting 7.7
modern 7.7
mail 7.7
exchange 7.6
medium 7.5
electronic 7.5
economy 7.4
closeup 7.4
single 7.4
billboard 7.4
letter 7.3
color 7.2
icon 7.1
love 7.1
work 7.1
reflection 7

Google
created on 2021-12-14

Microsoft
created on 2021-12-14

human face 99.5
person 97.9
clothing 97
smile 96.3
woman 96
indoor 89.8
text 87.1
posing 69
portrait 65
old 46.2
picture frame 34.5

Face analysis

AWS Rekognition

Age 13-25
Gender Female, 99%
Happy 97.1%
Confused 0.6%
Surprised 0.5%
Disgusted 0.5%
Angry 0.5%
Calm 0.4%
Fear 0.3%
Sad 0.1%

AWS Rekognition

Age 22-34
Gender Female, 80.7%
Disgusted 45.4%
Calm 18.5%
Happy 13%
Angry 11.8%
Sad 4.3%
Confused 3.7%
Fear 2%
Surprised 1.2%

AWS Rekognition

Age 22-34
Gender Female, 99.1%
Happy 98.3%
Confused 0.5%
Fear 0.3%
Angry 0.3%
Disgusted 0.2%
Surprised 0.2%
Calm 0.1%
Sad 0.1%
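Each AWS Rekognition face result above reports a confidence score per emotion; the highest-scoring emotion can be read as the dominant one. A hedged sketch, with the dictionary transcribed from the second face's scores above:

```python
# Emotion scores for the second detected face, transcribed from above.
# The variable name is illustrative; Rekognition's DetectFaces API
# returns these as a list of {"Type": ..., "Confidence": ...} entries.
face_2 = {
    "Disgusted": 45.4,
    "Calm": 18.5,
    "Happy": 13.0,
    "Angry": 11.8,
    "Sad": 4.3,
    "Confused": 3.7,
    "Fear": 2.0,
    "Surprised": 1.2,
}

# Dominant emotion = highest confidence score
dominant = max(face_2, key=face_2.get)
print(dominant)  # Disgusted
```

Note that for this face the top score (45.4%) is well under 50%, so the "Disgusted" label is far less certain than the near-unanimous "Happy" scores for the other two faces.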

Microsoft Cognitive Services

Age 28
Gender Female

Microsoft Cognitive Services

Age 44
Gender Female

Microsoft Cognitive Services

Age 27
Gender Female

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very likely
Headwear Unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very likely
Headwear Possible
Blurred Very unlikely

Feature analysis

Amazon

Person 99.7%
Painting 71%

Captions

Microsoft

a group of people standing in front of a mirror posing for the camera 77.4%
a group of people in front of a mirror posing for the camera 77.3%
a person standing in front of a mirror posing for the camera 75.8%

Text analysis

Amazon

MANCHESTER,N.H.
STUDIO
DURETTE STUDIO
DURETTE

Google

DURETTE&TUDIO MANCHESTER,N.H.
MANCHESTER,N.H.
DURETTE&TUDIO