Human Generated Data

Title

Untitled (two women sitting and drinking tea)

Date

c. 1950

People

Artist: Jack Gould, American

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.14925

Machine Generated Data

Tags

Amazon
created on 2022-01-29

Clothing 99.9
Apparel 99.9
Human 99.3
Person 99.3
Person 99.1
Person 98.8
Person 94.9
Female 85
Footwear 83.3
Person 79.5
Person 77.2
Flooring 76.2
Shorts 74.2
Woman 73.6
Floor 71.9
Shoe 70.9
Person 70
Skirt 67
Overcoat 60.9
Coat 60.9
Suit 58.7
Sleeve 56
Person 54.8
Person 43.9
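
The list above has the shape of an AWS Rekognition DetectLabels response: one label name per line with a confidence score from 0 to 100. As a minimal sketch only (not the pipeline that produced this record), assuming boto3 credentials are already configured and using a hypothetical local file name for the photograph:

import boto3

# Hypothetical local copy of the photograph; the source image is not part of this record.
IMAGE_PATH = "untitled_two_women_drinking_tea.jpg"

client = boto3.client("rekognition")

with open(IMAGE_PATH, "rb") as f:
    response = client.detect_labels(
        Image={"Bytes": f.read()},
        MinConfidence=40,  # assumption: a low threshold, to match the long tail of tags above
    )

# Print each label with its confidence, mirroring the "Clothing 99.9" style of the list above.
for label in response["Labels"]:
    print(f"{label['Name']} {label['Confidence']:.1f}")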

Imagga
created on 2022-01-29

man 30.9
people 29.6
person 28.5
shop 27.2
adult 26.8
teacher 23.4
barbershop 22.3
men 22.3
professional 21.8
business 20
male 19.9
hairdresser 19.6
women 18.2
office 18.1
businessman 17.7
indoors 17.6
happy 17.5
educator 16.8
mercantile establishment 14.9
room 14.6
two 14.4
portrait 14.2
salon 14
family 13.3
modern 13.3
interior 13.3
blackboard 13.2
home 12.8
pretty 12.6
happiness 12.5
fashion 12.1
corporate 12
indoor 11.9
communication 11.8
window 11.7
smiling 11.6
standing 11.3
group 11.3
looking 11.2
casual 11
lifestyle 10.8
black 10.8
attractive 10.5
urban 10.5
couple 10.5
work 10.2
place of business 10.1
house 10
chair 9.6
meeting 9.4
cleaner 9.3
team 9
sitting 8.6
smile 8.5
building 8.5
youth 8.5
city 8.3
inside 8.3
back 8.3
human 8.2
alone 8.2
board 8.1
sexy 8
clothing 8
child 7.9
together 7.9
glass 7.8
travel 7.7
hand 7.6
career 7.6
manager 7.4
life 7.4
style 7.4
teamwork 7.4
phone 7.4
student 7.4
lady 7.3
girls 7.3
new 7.3
businesswoman 7.3
worker 7.2
cute 7.2
handsome 7.1
job 7.1
working 7.1
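
Tags like these can be obtained from Imagga's /v2/tags endpoint, which scores each tag from 0 to 100. A minimal sketch under stated assumptions (the API credentials and the hosted image URL below are placeholders, not values from this record):

import requests

IMAGGA_KEY = "your_api_key"        # placeholder
IMAGGA_SECRET = "your_api_secret"  # placeholder
IMAGE_URL = "https://example.org/untitled_two_women_drinking_tea.jpg"  # placeholder

response = requests.get(
    "https://api.imagga.com/v2/tags",
    params={"image_url": IMAGE_URL},
    auth=(IMAGGA_KEY, IMAGGA_SECRET),
)
response.raise_for_status()

# Each entry carries a confidence and an English tag name, as in the list above.
for tag in response.json()["result"]["tags"]:
    print(f"{tag['tag']['en']} {tag['confidence']:.1f}")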

Google
created on 2022-01-29

Footwear 98.1
Shoe 95.3
Leg 91
Black-and-white 85.4
Style 83.9
Knee 75.2
Font 74.6
Monochrome photography 73.6
Picture frame 73
Monochrome 72.7
Hat 72.6
Event 71.5
Art 71.4
Boot 70.7
Fashion design 67.7
Street fashion 66.5
Room 66.2
Vintage clothing 64.8
Stock photography 64.7
Sitting 62.3
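
These labels match what Google Cloud Vision label detection returns; the API reports a score between 0 and 1, shown above on a 0-100 scale. A minimal sketch, assuming the google-cloud-vision client library with application default credentials and a hypothetical local file name:

from google.cloud import vision

IMAGE_PATH = "untitled_two_women_drinking_tea.jpg"  # hypothetical local copy

client = vision.ImageAnnotatorClient()

with open(IMAGE_PATH, "rb") as f:
    image = vision.Image(content=f.read())

response = client.label_detection(image=image)

# Scores are 0-1 floats; scale to 0-100 to mirror the list above.
for label in response.label_annotations:
    print(f"{label.description} {label.score * 100:.1f}")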

Microsoft
created on 2022-01-29

clothing 98.3
text 97.5
person 94.6
footwear 93.1
black and white 85.3
woman 83.9
dress 82.1
street 70.4
man 58.4
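
The Microsoft tags look like output from the Azure Computer Vision Analyze Image operation with the Tags visual feature. A minimal sketch against the v3.2 REST endpoint, assuming a placeholder endpoint, subscription key, and image URL:

import requests

ENDPOINT = "https://<your-resource>.cognitiveservices.azure.com"  # placeholder
KEY = "your_subscription_key"                                     # placeholder
IMAGE_URL = "https://example.org/untitled_two_women_drinking_tea.jpg"  # placeholder

response = requests.post(
    f"{ENDPOINT}/vision/v3.2/analyze",
    params={"visualFeatures": "Tags"},
    headers={"Ocp-Apim-Subscription-Key": KEY},
    json={"url": IMAGE_URL},
)
response.raise_for_status()

# Each tag has a name and a 0-1 confidence; scale to match the 0-100 figures above.
for tag in response.json()["tags"]:
    print(f"{tag['name']} {tag['confidence'] * 100:.1f}")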

Face analysis

Amazon

AWS Rekognition

Age 52-60
Gender Male, 96.3%
Calm 89.3%
Sad 4.9%
Confused 3.1%
Surprised 0.9%
Angry 0.6%
Happy 0.5%
Fear 0.4%
Disgusted 0.4%

AWS Rekognition

Age 24-34
Gender Male, 100%
Surprised 64.3%
Calm 25.6%
Disgusted 3.9%
Sad 2.1%
Confused 1.4%
Angry 1.4%
Fear 0.8%
Happy 0.6%

AWS Rekognition

Age 25-35
Gender Male, 99.7%
Surprised 80.4%
Calm 16.6%
Disgusted 0.7%
Sad 0.6%
Angry 0.6%
Happy 0.6%
Fear 0.4%
Confused 0.1%

AWS Rekognition

Age 28-38
Gender Male, 95.2%
Calm 96.9%
Disgusted 1%
Confused 1%
Happy 0.6%
Angry 0.2%
Surprised 0.2%
Sad 0.1%
Fear 0%

AWS Rekognition

Age 23-33
Gender Male, 99.9%
Calm 79.5%
Surprised 17.1%
Happy 0.9%
Angry 0.6%
Sad 0.6%
Disgusted 0.6%
Confused 0.5%
Fear 0.3%
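
The five face entries above follow the shape of an AWS Rekognition DetectFaces response, which reports an age range, a gender guess with confidence, and per-emotion confidences for each detected face. A minimal sketch, assuming boto3 credentials and the same hypothetical image file as in the label example:

import boto3

IMAGE_PATH = "untitled_two_women_drinking_tea.jpg"  # hypothetical local copy

client = boto3.client("rekognition")

with open(IMAGE_PATH, "rb") as f:
    response = client.detect_faces(
        Image={"Bytes": f.read()},
        Attributes=["ALL"],  # needed to get age range, gender, and emotions
    )

for face in response["FaceDetails"]:
    age = face["AgeRange"]
    gender = face["Gender"]
    print(f"Age {age['Low']}-{age['High']}")
    print(f"Gender {gender['Value']}, {gender['Confidence']:.1f}%")
    # Emotions come back unordered; sort by confidence to mirror the listings above.
    for emotion in sorted(face["Emotions"], key=lambda e: e["Confidence"], reverse=True):
        print(f"{emotion['Type'].title()} {emotion['Confidence']:.1f}%")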

Feature analysis

Amazon

Person 99.3%
Shoe 70.9%

Captions

Microsoft

a group of people walking down a street next to a window 82.3%
a group of people standing next to a window 74.9%
a group of people standing in front of a window 74.1%
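
The three ranked captions are the kind of output the Azure Computer Vision Describe Image operation returns. A minimal sketch against the v3.2 REST endpoint, reusing the placeholder endpoint, key, and image URL from the tag example above and requesting up to three candidate captions:

import requests

ENDPOINT = "https://<your-resource>.cognitiveservices.azure.com"  # placeholder
KEY = "your_subscription_key"                                     # placeholder
IMAGE_URL = "https://example.org/untitled_two_women_drinking_tea.jpg"  # placeholder

response = requests.post(
    f"{ENDPOINT}/vision/v3.2/describe",
    params={"maxCandidates": 3},
    headers={"Ocp-Apim-Subscription-Key": KEY},
    json={"url": IMAGE_URL},
)
response.raise_for_status()

# Each candidate caption has a 0-1 confidence; scale to match the percentages above.
for caption in response.json()["description"]["captions"]:
    print(f"{caption['text']} {caption['confidence'] * 100:.1f}%")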