Human Generated Data

Title

Untitled (parents and two small children posed on stairs in church near mantel lined with candles)

Date

1954

People

Artist: Martin Schweig, American 20th century

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.9509

Machine Generated Data

Tags (label, confidence %)

Amazon
created on 2022-01-28

Clothing 99.7
Apparel 99.7
Person 98.7
Human 98.7
Helmet 98.5
Person 97.8
Dress 97.7
Person 97.1
Person 96.8
Female 92.1
Sunglasses 82.6
Accessory 82.6
Accessories 82.6
Woman 79.2
Suit 76.8
Overcoat 76.8
Coat 76.8
Furniture 73.7
Chair 73.7
Floor 73.6
Teen 70.8
Girl 70.8
Blonde 70.8
Kid 70.8
Child 70.8
Face 66.8
Gown 65.3
Evening Dress 65.3
Robe 65.3
Fashion 65.3
People 64.7
Sitting 62.9
Sleeve 61.3
Leisure Activities 60.2
Shoe 58.7
Footwear 58.7
Flooring 58.4
Door 57.7
Long Sleeve 56.2
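
The Amazon labels above are the kind of output returned by AWS Rekognition's DetectLabels operation. The sketch below, using boto3, is illustrative only; the file name, region, and thresholds are assumptions rather than details taken from this record.

```python
# Minimal sketch of producing a label/confidence list like the one above
# with AWS Rekognition's DetectLabels API. File name, region, and limits
# are placeholder assumptions.
import boto3

rekognition = boto3.client("rekognition", region_name="us-east-1")

with open("4.2002.9509.jpg", "rb") as f:  # hypothetical local copy of the image
    image_bytes = f.read()

response = rekognition.detect_labels(
    Image={"Bytes": image_bytes},
    MaxLabels=50,        # assumed cap on returned labels
    MinConfidence=55.0,  # roughly matches the lowest score shown above (56.2)
)

# Print "Label confidence" pairs in the same style as the listing above.
for label in response["Labels"]:
    print(f'{label["Name"]} {label["Confidence"]:.1f}')
```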

Imagga
created on 2022-01-28

wheelchair 29.7
man 29.6
people 27.9
chair 26.1
kin 22
adult 21.9
person 21.1
male 20.7
city 19.1
seat 17.9
business 17
work 16.5
men 16.3
urban 15.7
working 14.1
lifestyle 13.7
women 13.4
job 13.3
businessman 13.2
street 12.9
black 12.6
office 12.2
smile 12.1
building 11.9
businesswoman 11.8
worker 11.6
outdoors 11.4
happy 11.3
furniture 11
bag 10.9
outdoor 10.7
bench 10.5
professional 10.4
sitting 10.3
youth 10.2
transportation 9.9
couple 9.6
smiling 9.4
two 9.3
travel 9.2
park 9.1
fashion 9
pedestrian 9
group 8.9
life 8.7
cold 8.6
corporate 8.6
walking 8.5
casual 8.5
portrait 8.4
road 8.1
computer 8
pretty 7.7
attractive 7.7
old 7.7
hand 7.6
communication 7.6
hairdresser 7.5
friendship 7.5
one 7.5
occupation 7.3
transport 7.3
alone 7.3
looking 7.2
school 7.2
executive 7
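
The Imagga tag list above is the sort of response produced by Imagga's v2 image-tagging endpoint. A rough sketch follows, assuming a local copy of the image and placeholder API credentials; the exact parameters used to generate this record are not documented here.

```python
# Rough sketch of requesting a tag list like the Imagga block above from
# the Imagga v2 tagging endpoint. Credentials and file name are placeholders.
import requests

API_KEY = "your_imagga_api_key"        # placeholder
API_SECRET = "your_imagga_api_secret"  # placeholder

with open("4.2002.9509.jpg", "rb") as f:  # hypothetical local copy of the image
    response = requests.post(
        "https://api.imagga.com/v2/tags",
        auth=(API_KEY, API_SECRET),
        files={"image": f},
    )

# Each entry carries a confidence score and a language-keyed tag name.
for entry in response.json()["result"]["tags"]:
    print(f'{entry["tag"]["en"]} {entry["confidence"]:.1f}')
```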

Google
created on 2022-01-28

Microsoft
created on 2022-01-28

person 96.5
text 94.6
street 93.7
statue 92.9
clothing 91.5
outdoor 85.1
black and white 84.7
woman 67.2
monochrome 57.5
footwear 52.9
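
The Microsoft tags here, and the captions listed under "Captions" further down, match the output shape of the Azure Computer Vision analyze operation with the Tags and Description features requested. The sketch below is a hedged illustration; the endpoint, key, API version, and file name are placeholders, not details from this record.

```python
# Hedged sketch of an Azure Computer Vision "analyze" call that can yield
# both a tag list like the Microsoft block above and captions like those
# in the "Captions" section below. Endpoint, key, and version are assumed.
import requests

ENDPOINT = "https://<your-resource>.cognitiveservices.azure.com"  # placeholder
KEY = "your_azure_key"                                            # placeholder

with open("4.2002.9509.jpg", "rb") as f:  # hypothetical local copy of the image
    image_bytes = f.read()

response = requests.post(
    f"{ENDPOINT}/vision/v3.2/analyze",
    params={"visualFeatures": "Tags,Description"},
    headers={
        "Ocp-Apim-Subscription-Key": KEY,
        "Content-Type": "application/octet-stream",
    },
    data=image_bytes,
)
result = response.json()

# Confidences come back in the 0-1 range, so scale by 100 to match
# the percentages shown in this record.
for tag in result["tags"]:
    print(f'{tag["name"]} {tag["confidence"] * 100:.1f}')
for caption in result["description"]["captions"]:
    print(f'{caption["text"]} {caption["confidence"] * 100:.1f}%')
```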

Face analysis

Amazon

AWS Rekognition

Age 31-41
Gender Female, 94.6%
Calm 96.1%
Surprised 1.6%
Sad 1.3%
Disgusted 0.4%
Fear 0.2%
Happy 0.2%
Angry 0.1%
Confused 0.1%

AWS Rekognition

Age 48-56
Gender Male, 94.6%
Happy 95.9%
Confused 1.1%
Disgusted 1.1%
Calm 0.6%
Sad 0.4%
Fear 0.4%
Angry 0.3%
Surprised 0.3%

AWS Rekognition

Age 9-17
Gender Male, 95.8%
Calm 66.9%
Surprised 17.3%
Fear 7.8%
Sad 3.5%
Happy 2.2%
Angry 1.1%
Disgusted 0.9%
Confused 0.4%

AWS Rekognition

Age 45-53
Gender Male, 87.6%
Happy 53.9%
Calm 39.4%
Sad 2.7%
Confused 1.5%
Surprised 0.9%
Disgusted 0.8%
Angry 0.4%
Fear 0.4%
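
Each AWS Rekognition block above (an age range, a gender estimate, and ranked emotions) mirrors one entry of the FaceDetails list returned by the DetectFaces operation when all attributes are requested. A minimal sketch follows; the file name and region are assumptions for illustration only.

```python
# Minimal sketch of how age-range, gender, and emotion estimates like those
# above are returned by AWS Rekognition's DetectFaces API.
import boto3

rekognition = boto3.client("rekognition", region_name="us-east-1")

with open("4.2002.9509.jpg", "rb") as f:  # hypothetical local copy of the image
    image_bytes = f.read()

response = rekognition.detect_faces(
    Image={"Bytes": image_bytes},
    Attributes=["ALL"],  # request age range, gender, emotions, etc.
)

for face in response["FaceDetails"]:
    age = face["AgeRange"]
    gender = face["Gender"]
    print(f'Age {age["Low"]}-{age["High"]}')
    print(f'Gender {gender["Value"]}, {gender["Confidence"]:.1f}%')
    # Emotions are returned unordered; sort to match the descending lists above.
    for emotion in sorted(face["Emotions"], key=lambda e: -e["Confidence"]):
        print(f'{emotion["Type"].capitalize()} {emotion["Confidence"]:.1f}%')
```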

Feature analysis

Amazon

Person 98.7%
Helmet 98.5%
Sunglasses 82.6%

Captions

Microsoft

a man and a woman sitting on a bench 61.4%
a man and woman sitting on a bench 55.8%
a man and a woman sitting on a bench in front of a building 55.7%

Text analysis

Amazon

11500
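
A detected string such as the "11500" above is the kind of result returned by AWS Rekognition's DetectText operation. The sketch below is illustrative; the file name and region are assumptions.

```python
# Sketch of the AWS Rekognition DetectText call that a string like the
# "11500" listed above would come from.
import boto3

rekognition = boto3.client("rekognition", region_name="us-east-1")

with open("4.2002.9509.jpg", "rb") as f:  # hypothetical local copy of the image
    image_bytes = f.read()

response = rekognition.detect_text(Image={"Bytes": image_bytes})

# LINE detections group WORD detections; print only the line-level results.
for detection in response["TextDetections"]:
    if detection["Type"] == "LINE":
        print(detection["DetectedText"])
```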