Human Generated Data

Title

Untitled (parents with daughter looking at portrait of woman, gathered in living room in dress clothes)

Date

c. 1940

People

Artist: Curtis Studio, American, active 1891–1935

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.13109

Machine Generated Data

Tags

Amazon
created on 2022-01-29

Person 98.6
Human 98.6
Performer 98.2
Dance Pose 97.9
Leisure Activities 97.9
Person 95.2
Person 95.1
Clothing 82.3
Apparel 82.3
Dance 80.6
Face 78.2
Flamenco 73.8
Person 69.8
Portrait 67.6
Photography 67.6
Photo 67.6
Indoors 63.8
Room 63.8
Female 62.1
Suit 60.9
Coat 60.9
Overcoat 60.9
Living Room 57.8

Imagga
created on 2022-01-29

person 26.4
people 26.2
office 25.3
adult 24.6
newspaper 24.3
man 23.5
business 23.1
indoors 20.2
product 20.2
computer 20.1
home 19.9
creation 17.9
laptop 17.7
working 17.7
businesswoman 17.3
male 17.1
negative 16.9
room 16.6
indoor 16.4
portrait 16.2
attractive 16.1
businessman 15.9
professional 15.7
lifestyle 15.2
technology 14.8
smiling 14.5
work 14.1
happy 13.8
looking 13.6
film 13.3
pretty 13.3
black 13.2
shop 13.2
sitting 12.9
house 11.7
job 11.5
corporate 11.2
women 11.1
smile 10.7
face 10.6
businesspeople 10.4
monitor 10.4
executive 10.2
casual 10.2
back 10.1
communication 10.1
alone 10
interior 9.7
window 9.7
screen 9.6
clothing 9.6
photographic paper 9.5
one person 9.4
20s 9.2
modern 9.1
confident 9.1
holding 9.1
worker 8.9
cheerful 8.9
group 8.9
desk 8.9
talking 8.6
horizontal 8.4
phone 8.3
one 8.2
lady 8.1
sexy 8
men 7.7
musical instrument 7.6
photograph 7.6
females 7.6
barbershop 7.6
mercantile establishment 7.5
mature 7.4
girls 7.3
mug shot 7.3
student 7.2
body 7.2
team 7.2
family 7.1

Google
created on 2022-01-29

Microsoft
created on 2022-01-29

text 97.6
person 95.8
black and white 85
clothing 74.6
art 53.4

Face analysis

AWS Rekognition

Age 45-53
Gender Male, 99.3%
Happy 74.7%
Sad 15.7%
Calm 4.6%
Confused 2.5%
Disgusted 0.7%
Fear 0.7%
Surprised 0.6%
Angry 0.5%

AWS Rekognition

Age 28-38
Gender Female, 99.6%
Calm 35.8%
Surprised 32%
Angry 11.1%
Happy 6.8%
Sad 5.8%
Disgusted 4.2%
Confused 3.4%
Fear 0.9%

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person 98.6%

Captions

Microsoft

a man and a woman standing in front of a computer 71.4%
a person standing in front of a computer 71.3%
a person talking on a cell phone 41.1%

Text analysis

Amazon

aa
YT33A2
324RJ
YT33A2 KAGOK
KAGOK