Human Generated Data

Title

Untitled (elderly man sitting on city bench)

Date

1961

People

Artist: Jack Gould, American

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.16099

Machine Generated Data

Tags

Amazon
created on 2022-02-05

Person 99.8
Human 99.8
Person 99.8
Person 99.8
Person 99.7
Person 99
Person 98.9
Pedestrian 98.1
Person 94.1
Path 91.5
Furniture 89.9
Shoe 82.4
Footwear 82.4
Clothing 82.4
Apparel 82.4
Coat 70.9
Overcoat 70.9
Sidewalk 69.4
Pavement 69.4
Bench 68.1
People 60.1
Shoe 56.3
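The Amazon label list above contains repeated concepts at different confidences (several "Person" hits, two "Shoe" hits). A minimal Python sketch, using only scores transcribed from the list above, showing one way to deduplicate such output by keeping the highest confidence per label:

```python
# Deduplicate machine-generated labels, keeping the highest
# confidence seen for each label name. Scores are transcribed
# from the Amazon (AWS Rekognition) tag list above.
labels = [
    ("Person", 99.8), ("Human", 99.8), ("Person", 99.8),
    ("Person", 99.8), ("Person", 99.7), ("Person", 99.0),
    ("Person", 98.9), ("Pedestrian", 98.1), ("Person", 94.1),
    ("Path", 91.5), ("Furniture", 89.9), ("Shoe", 82.4),
    ("Footwear", 82.4), ("Clothing", 82.4), ("Apparel", 82.4),
    ("Coat", 70.9), ("Overcoat", 70.9), ("Sidewalk", 69.4),
    ("Pavement", 69.4), ("Bench", 68.1), ("People", 60.1),
    ("Shoe", 56.3),
]

def dedupe(pairs):
    """Map each label to the highest confidence it received."""
    best = {}
    for name, score in pairs:
        if score > best.get(name, 0.0):
            best[name] = score
    return best

best = dedupe(labels)
```

After deduplication the 22 raw detections collapse to 15 distinct labels, with "Person" at 99.8 and "Shoe" at its higher score of 82.4.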

Imagga
created on 2022-02-05

engineer 28.5
man 24.2
percussion instrument 22.1
people 20.6
male 20.6
building 20.3
musical instrument 19.5
architecture 19
city 18.3
silhouette 18.2
tripod 18.1
travel 17.6
sky 16.6
television camera 15.7
rack 14.4
person 13.9
urban 13.1
support 12.9
marimba 12.9
business 12.7
television equipment 12.6
old 12.5
adult 12.3
vibraphone 12.1
device 11.5
black 10.3
electronic equipment 10
worker 9.8
buildings 9.4
work 9.4
window 9.2
tourism 9.1
landmark 9
photographer 9
equipment 9
couple 8.7
house 8.4
alone 8.2
one 8.2
vacation 8.2
square 8.1
office 8.1
water 8
tool 7.9
art 7.8
glass 7.8
men 7.7
roof 7.6
outdoors 7.5
landscape 7.4
town 7.4
street 7.4
light 7.3
historic 7.3
reflection 7.3
rake 7.3
life 7.2
suit 7.2
religion 7.2
tower 7.2

Google
created on 2022-02-05

Trousers 96.3
Building 93.9
Sky 92
Travel 82.4
Line 82.3
Tints and shades 77.1
Urban design 75.8
Road 75.5
City 74.1
Tree 72.8
Street fashion 72.2
Pedestrian 70.3
Sidewalk 68.8
Road surface 67.4
Street 67.1
Leisure 64.9
Metal 62.8
Stock photography 62.2
Sitting 61.9
Musician 61.7

Microsoft
created on 2022-02-05

outdoor 97.7
person 97.2
clothing 95.5
man 90.4
street 89.7
footwear 82.8
city 78.4
skyscraper 69.3
text 66.9
woman 61.1

Face analysis

AWS Rekognition

Age 49-57
Gender Male, 96.5%
Sad 68.7%
Calm 8.2%
Fear 6.6%
Confused 4%
Happy 3.6%
Angry 3.2%
Surprised 2.9%
Disgusted 2.8%

AWS Rekognition

Age 35-43
Gender Female, 75.4%
Happy 73%
Calm 20%
Fear 2.5%
Sad 1.3%
Angry 1.2%
Surprised 1%
Disgusted 0.7%
Confused 0.4%

AWS Rekognition

Age 13-21
Gender Female, 80.4%
Sad 67.8%
Calm 23.9%
Disgusted 2.5%
Fear 1.9%
Happy 1.4%
Angry 1%
Surprised 0.9%
Confused 0.5%
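Each AWS Rekognition face record above ranks eight emotions by confidence. A small sketch, with percentages transcribed from the three records above, reducing each record to its dominant emotion:

```python
# Reduce each Rekognition face record to its top-scoring emotion.
# Percentages are transcribed from the three face records above.
faces = [
    {"Sad": 68.7, "Calm": 8.2, "Fear": 6.6, "Confused": 4.0,
     "Happy": 3.6, "Angry": 3.2, "Surprised": 2.9, "Disgusted": 2.8},
    {"Happy": 73.0, "Calm": 20.0, "Fear": 2.5, "Sad": 1.3,
     "Angry": 1.2, "Surprised": 1.0, "Disgusted": 0.7, "Confused": 0.4},
    {"Sad": 67.8, "Calm": 23.9, "Disgusted": 2.5, "Fear": 1.9,
     "Happy": 1.4, "Angry": 1.0, "Surprised": 0.9, "Confused": 0.5},
]

# For each face, pick the emotion with the highest confidence.
dominant = [max(face, key=face.get) for face in faces]
# dominant == ["Sad", "Happy", "Sad"]
```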

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Likely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person 99.8%
Shoe 82.4%

Captions

Microsoft

a group of people sitting on a bench 55.5%
a group of people sitting on a bench in front of a building 52%
a group of people that are sitting on a bench 49.5%

Text analysis

Amazon

LOUIS
LOANS
ROCK
AVE
ROCK H
1 LOUIS AVE
H
BENEFICIAL
1

Google

ROCK
LOUIS
ST
LOANS ROCK ST LOUIS AVE
LOANS
AVE
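Both OCR passes above recover overlapping fragments of the signage in the photograph. A sketch comparing the two outputs as word sets — the multi-word lines in each list ("ROCK H", "LOANS ROCK ST LOUIS AVE") are split into single tokens for the comparison, which is an assumption of this sketch:

```python
# Compare the word tokens recovered by the two text-detection
# passes above. Tokens are transcribed from the Amazon and Google
# lists, with multi-word lines split into individual words.
amazon = {"LOUIS", "LOANS", "ROCK", "AVE", "H", "BENEFICIAL", "1"}
google = {"ROCK", "LOUIS", "ST", "LOANS", "AVE"}

agreed = amazon & google        # words both services detected
only_amazon = amazon - google   # words only Amazon detected
only_google = google - amazon   # words only Google detected
```

The two services agree on the core of the sign ("LOANS", "ROCK", "LOUIS", "AVE"); only Amazon reads "BENEFICIAL", and only Google reads "ST".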