Human Generated Data

Title

Untitled (people in old-fashioned clothes on city street, backs of women)

Date

c. 1950

People

Artist: Jack Gould, American

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.16066.3

Machine Generated Data

Tags

Amazon
created on 2022-02-11

Person 99.7
Human 99.7
Person 99.4
Person 99.2
Person 99.1
Person 98.5
Person 97.8
Person 97.3
Person 96.1
Person 94.3
Clothing 93.7
Apparel 93.7
Person 91.1
Person 86.7
Meal 74.9
Food 74.9
Shorts 70
Person 69.2
People 63.9
Female 63.6
Portrait 61.8
Photo 61.8
Photography 61.8
Face 61.8
Person 60.6
Crowd 59
Text 55.1
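Each repeated "Person" entry above is a separate detection in the frame, each with its own confidence score (a percentage). A minimal sketch of filtering and counting such label data by a confidence threshold, with the pairs transcribed from the list above (the flat tuple list is an illustrative structure, not the raw Rekognition response shape):

```python
from collections import Counter

# Label/confidence pairs transcribed from the Amazon tag list above;
# each repeated "Person" is a separate detected instance.
labels = [
    ("Person", 99.7), ("Human", 99.7), ("Person", 99.4), ("Person", 99.2),
    ("Person", 99.1), ("Person", 98.5), ("Person", 97.8), ("Person", 97.3),
    ("Person", 96.1), ("Person", 94.3), ("Clothing", 93.7), ("Apparel", 93.7),
    ("Person", 91.1), ("Person", 86.7), ("Meal", 74.9), ("Food", 74.9),
    ("Shorts", 70.0), ("Person", 69.2), ("People", 63.9), ("Female", 63.6),
    ("Portrait", 61.8), ("Photo", 61.8), ("Photography", 61.8), ("Face", 61.8),
    ("Person", 60.6), ("Crowd", 59.0), ("Text", 55.1),
]

def confident_labels(pairs, threshold=90.0):
    """Keep only detections at or above the confidence threshold."""
    return [(name, conf) for name, conf in pairs if conf >= threshold]

counts = Counter(name for name, _ in confident_labels(labels))
print(counts["Person"])  # → 10 high-confidence person detections
```

At a 90% cutoff, ten of the fourteen "Person" detections survive, which matches the "group of people" reading in the captions further down.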

Imagga
created on 2022-02-11

barbershop 100
shop 100
mercantile establishment 77.5
place of business 51.7
establishment 25.8
people 24
man 20.8
home 19.9
newspaper 18.9
shoe shop 16.3
interior 15.9
room 15.8
window 15.8
male 15.6
person 14.8
house 14.2
product 13.9
office 13.8
men 13.7
adult 13.6
business 13.3
indoors 13.2
family 11.6
working 11.5
women 11.1
fashion 10.5
old 10.4
decoration 10.2
smiling 10.1
creation 10.1
worker 10
portrait 9.7
work 9.4
mother 9.4
happy 9.4
building 9.3
indoor 9.1
black 9
together 8.8
architecture 8.6
sitting 8.6
professional 8.5
two 8.5
mature 8.4
vintage 8.3
style 8.2
team 8.1
group 8.1
chair 8
smile 7.8
couple 7.8
ancient 7.8
glass 7.8
modern 7.7
health 7.6
doctor 7.5
senior 7.5
city 7.5
salon 7.3
dress 7.2
religion 7.2
medical 7.1
happiness 7
hospital 7

Microsoft
created on 2022-02-11

text 99.1
person 98.1
clothing 84.5
people 76.9
group 74
old 62.4
woman 57.5
clothes 23.2
several 10.4

Face analysis

AWS Rekognition

Age 53-61
Gender Male, 98.6%
Calm 97.6%
Sad 2%
Disgusted 0.1%
Confused 0.1%
Happy 0.1%
Angry 0.1%
Surprised 0%
Fear 0%

AWS Rekognition

Age 24-34
Gender Male, 54.2%
Calm 75.1%
Sad 23.3%
Confused 0.4%
Angry 0.3%
Fear 0.3%
Disgusted 0.2%
Surprised 0.2%
Happy 0.2%

AWS Rekognition

Age 26-36
Gender Female, 81.1%
Sad 72.2%
Calm 19.5%
Fear 4.5%
Happy 2.1%
Angry 0.7%
Confused 0.4%
Disgusted 0.3%
Surprised 0.1%

AWS Rekognition

Age 25-35
Gender Male, 99.7%
Calm 99.8%
Sad 0.1%
Disgusted 0%
Happy 0%
Confused 0%
Angry 0%
Surprised 0%
Fear 0%

AWS Rekognition

Age 35-43
Gender Male, 92.4%
Calm 99.4%
Sad 0.2%
Happy 0.2%
Disgusted 0.1%
Angry 0.1%
Surprised 0%
Confused 0%
Fear 0%
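Each AWS Rekognition face block above reports an estimated age range, a gender call with confidence, and an emotion distribution summing to roughly 100%. A sketch of summarizing one such block, with values transcribed from the first face above (the dict layout and field names are simplified for illustration, not the raw Rekognition response shape):

```python
# Values transcribed from the first AWS Rekognition face block above.
face = {
    "age_range": (53, 61),
    "gender": ("Male", 98.6),
    "emotions": {
        "Calm": 97.6, "Sad": 2.0, "Disgusted": 0.1, "Confused": 0.1,
        "Happy": 0.1, "Angry": 0.1, "Surprised": 0.0, "Fear": 0.0,
    },
}

def summarize(face):
    """Midpoint age estimate plus the highest-scoring emotion."""
    low, high = face["age_range"]
    emotion, score = max(face["emotions"].items(), key=lambda kv: kv[1])
    return {"age_estimate": (low + high) / 2,
            "dominant_emotion": emotion,
            "emotion_confidence": score}

print(summarize(face))
# → {'age_estimate': 57.0, 'dominant_emotion': 'Calm', 'emotion_confidence': 97.6}
```

Applied to the second face (Calm 75.1% vs. Sad 23.3%), the same summary would still pick "Calm", though by a much narrower margin.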

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely
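Unlike Rekognition, Google Vision reports bucketed likelihoods rather than numeric scores, using its documented Likelihood enum (UNKNOWN, VERY_UNLIKELY, UNLIKELY, POSSIBLE, LIKELY, VERY_LIKELY, in increasing order). A sketch mapping the labels above onto that ordinal scale so a threshold can be tested, using the second face above, which is the only one where "Headwear" rises to "Unlikely":

```python
# Google Vision's Likelihood buckets, in increasing order of likelihood.
LIKELIHOOD_ORDER = ["UNKNOWN", "VERY_UNLIKELY", "UNLIKELY",
                    "POSSIBLE", "LIKELY", "VERY_LIKELY"]

def at_least(label, threshold):
    """True if a bucketed likelihood meets or exceeds a threshold bucket."""
    return LIKELIHOOD_ORDER.index(label) >= LIKELIHOOD_ORDER.index(threshold)

# Second Google Vision face above: Headwear "Unlikely", all else "Very unlikely".
face2 = {"surprise": "VERY_UNLIKELY", "anger": "VERY_UNLIKELY",
         "sorrow": "VERY_UNLIKELY", "joy": "VERY_UNLIKELY",
         "headwear": "UNLIKELY", "blurred": "VERY_UNLIKELY"}

print(at_least(face2["headwear"], "POSSIBLE"))   # → False
print(at_least(face2["headwear"], "UNLIKELY"))   # → True
```

The ordinal comparison is the usual way to apply a cutoff such as "treat anything POSSIBLE or above as detected" to these bucketed outputs.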

Feature analysis

Amazon

Person 99.7%

Captions

Microsoft

a group of people posing for a photo 73.6%
a group of people posing for a picture 73.5%
a group of people posing for the camera 73.4%

Text analysis

Amazon

BANK
KS
BANN
STR
UDHV - SEA

Google

TENCTENTRANG
BANK BANN TENCTENTRANG
BANK
BANN