Human Generated Data

Title

Untitled (family of seven standing outside building between two tall trees, woman holds baby)

Date

c. 1935

People

Artist: Curtis Studio, American, active 1891–1935

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.13020

Machine Generated Data

Tags (confidence in %)

Amazon
created on 2022-02-05

Clothing 99.9
Apparel 99.9
Human 99.5
Person 99.5
Person 99.1
Person 98.9
Person 98.5
Person 98.2
Coat 95.9
Person 91.5
Shorts 91.4
Female 90.8
Shoe 83.8
Footwear 83.8
Overcoat 79.8
Suit 79.8
Woman 78.4
People 78.4
Shoe 78.2
Skirt 72.2
Shoe 69.5
Hat 68.6
Dress 65.9
Sun Hat 64.5
Accessories 63.7
Accessory 63.7
Tie 63.7
Evening Dress 61.4
Fashion 61.4
Robe 61.4
Gown 61.4
Military 57.3
Military Uniform 57.3
Officer 57.3
Sleeve 57
Standing 55.3
Shoe 51.5
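
The record does not document the pipeline that produced these tags, but the label/confidence pairs match the output of Amazon Rekognition's DetectLabels API. A minimal sketch of how such a list is typically obtained with boto3; the file name and confidence threshold are assumptions, not part of this record:

import boto3

# Hypothetical local copy of the photograph; the source image is not part of this record.
with open("4.2002.13020.jpg", "rb") as f:
    image_bytes = f.read()

rekognition = boto3.client("rekognition")

# DetectLabels returns label names with 0-100 confidence scores,
# the format of the tag list above.
response = rekognition.detect_labels(
    Image={"Bytes": image_bytes},
    MinConfidence=50,  # assumed cutoff; nothing in the list above falls below ~51
)
for label in response["Labels"]:
    print(f"{label['Name']} {label['Confidence']:.1f}")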

Imagga
created on 2022-02-05

metropolitan 70.1
architecture 26.7
old 26.5
building 25.5
kin 22.1
church 18.5
street 18.4
city 18.3
people 17.9
stone 17.8
tourism 17.3
history 15.2
ancient 14.7
arch 14.6
religion 14.3
travel 14.1
monument 14
male 13.5
art 13
groom 12.9
tourist 12.2
culture 11.1
historic 11
columns 10.8
man 10.8
historical 10.3
men 10.3
wall 10.3
famous 10.2
palace 9.6
column 9.6
window 9.6
color 9.5
catholic 9.4
town 9.3
landmark 9
night 8.9
adult 8.7
scene 8.7
cathedral 8.6
sculpture 8.6
business 8.5
black 8.5
house 8.4
outdoors 8.2
statue 7.9
world 7.9
urban 7.9
antique 7.8
facade 7.7
exterior 7.4
group 7.3
women 7.1
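
The Imagga tags follow the response shape of Imagga's /v2/tags REST endpoint. A sketch, assuming hypothetical credentials and an image URL (neither appears in this record):

import requests

IMAGGA_KEY = "api_key"        # placeholder credential
IMAGGA_SECRET = "api_secret"  # placeholder credential
IMAGE_URL = "https://example.org/4.2002.13020.jpg"  # placeholder URL

# /v2/tags returns English tag names with 0-100 confidence scores,
# matching the list above.
response = requests.get(
    "https://api.imagga.com/v2/tags",
    params={"image_url": IMAGE_URL},
    auth=(IMAGGA_KEY, IMAGGA_SECRET),
)
for tag in response.json()["result"]["tags"]:
    print(f"{tag['tag']['en']} {tag['confidence']:.1f}")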

Google
created on 2022-02-05 (no tags returned)

Microsoft
created on 2022-02-05

clothing 95.7
text 91.7
person 91.2
woman 88.2
dress 87.8
standing 78.3
group 76.4
black and white 75.7
people 59.8
man 59.4
posing 45.5
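
The Microsoft tags match the output of Azure Computer Vision's image-tagging operation. A sketch using the azure-cognitiveservices-vision-computervision SDK; the endpoint, key, and file name are placeholders:

from azure.cognitiveservices.vision.computervision import ComputerVisionClient
from msrest.authentication import CognitiveServicesCredentials

client = ComputerVisionClient(
    "https://<resource>.cognitiveservices.azure.com/",   # placeholder endpoint
    CognitiveServicesCredentials("<subscription-key>"),  # placeholder key
)

with open("4.2002.13020.jpg", "rb") as f:  # hypothetical local image
    result = client.tag_image_in_stream(f)

# Each tag carries a name and a 0-1 confidence; scaled by 100
# it matches the list above.
for tag in result.tags:
    print(f"{tag.name} {tag.confidence * 100:.1f}")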

Face analysis

AWS Rekognition

Age 30-40
Gender Male, 95.2%
Confused 69.6%
Sad 16.6%
Happy 10.6%
Surprised 1.1%
Calm 0.7%
Disgusted 0.6%
Fear 0.5%
Angry 0.4%

AWS Rekognition

Age 14-22
Gender Male, 99.8%
Calm 96.5%
Sad 1.8%
Angry 0.4%
Disgusted 0.4%
Surprised 0.3%
Confused 0.2%
Happy 0.2%
Fear 0.1%

AWS Rekognition

Age 20-28
Gender Male, 99.3%
Calm 77.1%
Happy 16.1%
Sad 3.5%
Surprised 0.9%
Angry 0.9%
Disgusted 0.9%
Confused 0.5%
Fear 0.1%

AWS Rekognition

Age 42-50
Gender Male, 99.9%
Happy 50.7%
Calm 25.5%
Sad 11.7%
Fear 4%
Confused 2.4%
Disgusted 2.4%
Surprised 2%
Angry 1.3%

AWS Rekognition

Age 31-41
Gender Male, 99.3%
Calm 99.9%
Happy 0%
Sad 0%
Confused 0%
Surprised 0%
Angry 0%
Disgusted 0%
Fear 0%

AWS Rekognition

Age 25-35
Gender Male, 99.8%
Calm 77.7%
Sad 10.2%
Surprised 2.9%
Confused 2.9%
Fear 2.7%
Angry 1.3%
Disgusted 1.2%
Happy 1.1%
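
The six per-face blocks above carry the age-range, gender, and emotion fields returned by Rekognition's DetectFaces API when full attributes are requested. A sketch of how such per-face output is typically retrieved (file name assumed):

import boto3

rekognition = boto3.client("rekognition")

with open("4.2002.13020.jpg", "rb") as f:  # hypothetical local image
    image_bytes = f.read()

# Attributes=["ALL"] requests age range, gender, and emotions,
# the fields shown in each block above.
response = rekognition.detect_faces(
    Image={"Bytes": image_bytes},
    Attributes=["ALL"],
)
for face in response["FaceDetails"]:
    age, gender = face["AgeRange"], face["Gender"]
    print(f"Age {age['Low']}-{age['High']}")
    print(f"Gender {gender['Value']}, {gender['Confidence']:.1f}%")
    for emotion in sorted(face["Emotions"], key=lambda e: -e["Confidence"]):
        print(f"{emotion['Type'].title()} {emotion['Confidence']:.1f}%")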

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely
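
Unlike Rekognition, Google Cloud Vision's face detection reports likelihood buckets ("Very unlikely" through "Very likely") rather than numeric scores, which is the format of the two blocks above. A sketch using the google-cloud-vision client (file name assumed):

from google.cloud import vision

client = vision.ImageAnnotatorClient()

with open("4.2002.13020.jpg", "rb") as f:  # hypothetical local image
    image = vision.Image(content=f.read())

# Each face annotation exposes likelihood enums for the attributes listed above.
response = client.face_detection(image=image)
for face in response.face_annotations:
    for name, value in (
        ("Surprise", face.surprise_likelihood),
        ("Anger", face.anger_likelihood),
        ("Sorrow", face.sorrow_likelihood),
        ("Joy", face.joy_likelihood),
        ("Headwear", face.headwear_likelihood),
        ("Blurred", face.blurred_likelihood),
    ):
        print(name, vision.Likelihood(value).name)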

Feature analysis

Amazon

Person 99.5%
Coat 95.9%
Shoe 83.8%
Tie 63.7%

Captions

Microsoft

a group of people posing for a photo 96.9%
a group of people posing for a picture 96.8%
a group of people posing for the camera 96.7%
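
The three ranked captions match Azure Computer Vision's describe operation, which returns caption candidates with confidences. A sketch; the endpoint, key, and file name are placeholders, and max_candidates=3 is assumed from the three results above:

from azure.cognitiveservices.vision.computervision import ComputerVisionClient
from msrest.authentication import CognitiveServicesCredentials

client = ComputerVisionClient(
    "https://<resource>.cognitiveservices.azure.com/",   # placeholder endpoint
    CognitiveServicesCredentials("<subscription-key>"),  # placeholder key
)

with open("4.2002.13020.jpg", "rb") as f:  # hypothetical local image
    description = client.describe_image_in_stream(f, max_candidates=3)

# Each candidate caption carries a 0-1 confidence; scaled by 100
# it matches the percentages above.
for caption in description.captions:
    print(f"{caption.text} {caption.confidence * 100:.1f}%")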

Text analysis

Amazon

SI

Google

YAGON-YT3HA2-
NAMT2A3
YAGON-YT3HA2- NAMT2A3
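
The OCR results above are consistent with Rekognition's DetectText on the Amazon side and Cloud Vision's text detection on the Google side. A sketch of both calls (file name assumed); Cloud Vision returns per-segment annotations plus a combined string, which would explain why the Google result repeats:

import boto3
from google.cloud import vision

with open("4.2002.13020.jpg", "rb") as f:  # hypothetical local image
    image_bytes = f.read()

# Amazon: DetectText returns LINE and WORD detections ("SI" above).
rekognition = boto3.client("rekognition")
for det in rekognition.detect_text(Image={"Bytes": image_bytes})["TextDetections"]:
    if det["Type"] == "LINE":
        print(det["DetectedText"])

# Google: text_detection returns segment-level annotations and the full string.
g_client = vision.ImageAnnotatorClient()
g_response = g_client.text_detection(image=vision.Image(content=image_bytes))
for annotation in g_response.text_annotations:
    print(annotation.description)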