Human Generated Data

Title

Untitled (many people seated inside city bus)

Date

c. 1950

People

Artist: Jack Gould, American

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.14773

Machine Generated Data

Tags

Amazon
created on 2022-01-29

Person 99.7
Human 99.7
Person 99.5
Person 99.4
Person 98.6
Person 98.1
Clothing 95.5
Apparel 95.5
Steamer 92.5
Person 91.3
Tie 76.9
Accessories 76.9
Accessory 76.9
Shorts 76.8
Person 74.4
Face 72.4
Person 71.5
Crowd 70
People 69.6
Female 68.8
Outdoors 67.9
Transportation 66.5
Coat 66.2
Vehicle 65.7
Overcoat 63.5
Leisure Activities 62.6
Suit 61.8
Skin 60.5
Girl 60.3
Train 58.2
Monitor 56.6
Electronics 56.6
Screen 56.6
Display 56.6
Poster 55.8
Advertisement 55.8
Room 55.4
Indoors 55.4

Clarifai
created on 2023-10-28

people 99.7
group together 98.2
man 98
transportation system 97.1
train 96.6
group 96.4
railway 96.3
woman 95.1
locomotive 95
indoors 93.9
vehicle 91.7
adult 91.3
subway system 91.3
many 89.6
monochrome 82.7
watercraft 79.1
child 78.1
recreation 73.1
crowd 73
travel 72.7

Imagga
created on 2022-01-29

passenger 59.9
wagon 29.6
business 26.1
people 24
urban 21.8
train 20.6
wheeled vehicle 20.1
work 19.6
man 19.5
architecture 19.3
city 19.1
building 18.8
office 18.1
transportation 17.9
modern 16.8
station 16.7
transport 16.4
boutique 15.9
adult 15.5
corporate 14.6
men 14.6
interior 14.1
subway 13.8
lifestyle 13.7
smile 12.8
travel 12.7
women 12.6
attractive 12.6
crowd 12.5
portrait 12.3
hall 12.2
group 12.1
container 12
happy 11.9
pretty 11.9
businesswoman 11.8
job 11.5
businessman 11.5
male 11.3
vehicle 11.3
metal 11.3
company 11.2
industry 11.1
inside 11
person 10.9
worker 10.9
metro 10.8
walking 10.4
motion 10.3
construction 10.3
window 10
corridor 9.8
working 9.7
technology 9.6
black 9.6
manager 9.3
life 9
commuter 8.9
steel 8.8
railway 8.8
mall 8.8
public 8.7
glass 8.6
journey 8.5
center 8.3
success 8
indoors 7.9
railroad 7.9
professional 7.7
blurred 7.7
structure 7.6
engineering 7.6
meeting 7.5
executive 7.4
light 7.3
industrial 7.3
shop 7.3
looking 7.2
activity 7.2

Google
created on 2022-01-29

Microsoft
created on 2022-01-29

text 94.5
person 93.7
clothing 89.6
man 77.4
woman 65.6
black and white 56.8
people 55.4
clothes 37.4

Color Analysis

Face analysis

Amazon

Google

AWS Rekognition

Age 23-33
Gender Male, 100%
Happy 84.1%
Calm 12.8%
Surprised 2.4%
Disgusted 0.3%
Confused 0.2%
Angry 0.2%
Sad 0.1%
Fear 0.1%

AWS Rekognition

Age 25-35
Gender Male, 99.7%
Sad 43.6%
Calm 29.4%
Disgusted 14.9%
Confused 4.6%
Fear 3.9%
Surprised 1.4%
Angry 1.3%
Happy 1%

AWS Rekognition

Age 37-45
Gender Female, 99.2%
Happy 78.1%
Angry 11.7%
Calm 2.8%
Sad 2.5%
Fear 1.8%
Disgusted 1.4%
Confused 0.8%
Surprised 0.8%

AWS Rekognition

Age 49-57
Gender Female, 97.7%
Happy 92%
Calm 3%
Confused 2.4%
Sad 0.8%
Disgusted 0.7%
Surprised 0.4%
Fear 0.4%
Angry 0.3%

AWS Rekognition

Age 27-37
Gender Male, 94.2%
Sad 33.5%
Happy 17.4%
Fear 14.6%
Calm 13.4%
Confused 7.7%
Disgusted 4.8%
Surprised 4.7%
Angry 4%

AWS Rekognition

Age 40-48
Gender Male, 99.9%
Calm 99.6%
Sad 0.1%
Happy 0.1%
Fear 0.1%
Confused 0.1%
Angry 0%
Disgusted 0%
Surprised 0%

AWS Rekognition

Age 23-31
Gender Female, 57%
Calm 79.7%
Disgusted 9.3%
Surprised 3.4%
Happy 3.1%
Sad 1.4%
Confused 1.3%
Fear 1%
Angry 0.8%

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Unlikely
Blurred Likely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person
Tie
Train
Person 99.7%
Person 99.5%
Person 99.4%
Person 98.6%
Person 98.1%
Person 91.3%
Person 74.4%
Person 71.5%
Tie 76.9%
Train 58.2%

Categories

Text analysis

Amazon

ED
RS
CARS
DRUG
BLDG. DRUG
USED
-
BLDG.
-PRESCRIPTIONS
NAGON
МОКЕУ» -PRESCRIPTIONS
a
МОКЕУ»
15

Google

HEMCRHOS WRAIT SEP Se ED RS OSED CARS
HEMCRHOS
WRAIT
SEP
Se
ED
RS
OSED
CARS