Human Generated Data

Title

Untitled (elderly man sitting on city bench, head down)

Date

1961

People

Artist: Jack Gould, American

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.16098.1

Machine Generated Data

Tags

Amazon
created on 2022-02-11

Person 99.8
Human 99.8
Person 99.5
Person 99.3
Person 98.6
Pedestrian 95.3
Person 93.4
Clothing 83.9
Apparel 83.9
Path 83.6
Flagstone 78.3
Handrail 78
Banister 78
Shoe 69.8
Footwear 69.8
Sidewalk 65.4
Pavement 65.4
Overcoat 62.1
Coat 62.1
Pants 62.1
Spire 57.6
Building 57.6
Architecture 57.6
Tower 57.6
Steeple 57.6
Tarmac 56.9
Asphalt 56.9
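The Amazon tags above are label/confidence pairs in the shape returned by AWS Rekognition's DetectLabels API. A minimal sketch of how such a list might be thresholded and ranked (the sample values are taken from the tags above; the `filter_labels` helper is hypothetical, not part of any API):

```python
# Hypothetical sketch: filtering machine-generated labels by confidence,
# using the {"Name", "Confidence"} shape of a Rekognition DetectLabels response.
# Sample values are copied from the Amazon tags listed above.
sample_labels = [
    {"Name": "Person", "Confidence": 99.8},
    {"Name": "Pedestrian", "Confidence": 95.3},
    {"Name": "Clothing", "Confidence": 83.9},
    {"Name": "Shoe", "Confidence": 69.8},
    {"Name": "Asphalt", "Confidence": 56.9},
]

def filter_labels(labels, min_confidence=80.0):
    """Keep labels at or above the confidence threshold, highest first."""
    kept = [l for l in labels if l["Confidence"] >= min_confidence]
    return sorted(kept, key=lambda l: l["Confidence"], reverse=True)

for label in filter_labels(sample_labels):
    print(f'{label["Name"]} {label["Confidence"]}')
```

With the default 80.0 threshold, only Person, Pedestrian, and Clothing survive; lower-confidence tags like Asphalt (56.9) are dropped.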

Clarifai
created on 2023-10-29

people 99.5
street 99.4
city 99.1
urban 96.4
man 96.4
bench 95
group 92.3
adult 92.3
girl 89.7
portrait 89.5
woman 89.2
group together 87.9
school 86.1
road 85.2
editorial 83.8
appoint 83.8
couple 83.5
building 80.3
two 80.1
education 79.4

Imagga
created on 2022-02-11

television camera 30.2
city 27.4
television equipment 24.1
architecture 22.9
people 19
building 18.5
electronic equipment 18.3
travel 18.3
man 17.5
sky 17.2
urban 15.7
landmark 14.4
equipment 13.9
buildings 13.2
tourism 13.2
street 12.9
outdoor 12.2
monument 11.2
business 10.9
male 10.6
cityscape 10.4
clouds 10.1
engineer 9.9
old 9.7
support 9.3
photographer 9.2
black 9.2
road 9
tower 8.9
worker 8.9
person 8.7
water 8.7
sidewalk 8.5
tripod 8.5
park 8.4
stone 8.4
adult 8.4
structure 8.4
historic 8.2
vacation 8.2
square 8.1
day 7.8
statue 7.8
summer 7.7
tourist 7.7
england 7.6
united 7.6
device 7.5
outdoors 7.5
silhouette 7.4
church 7.4
barrier 7.2
holiday 7.2
love 7.1
work 7.1

Google
created on 2022-02-11

Microsoft
created on 2022-02-11

outdoor 99.4
ground 96.5
clothing 94
man 88.5
person 86.9
city 73.5
street 68.5
park 66.9
text 62.5
sky 52.6

Face analysis

AWS Rekognition

Age 27-37
Gender Female, 99.9%
Sad 69.4%
Calm 8.6%
Disgusted 7.8%
Angry 7.3%
Happy 3.5%
Surprised 1.2%
Fear 1.2%
Confused 0.9%

AWS Rekognition

Age 36-44
Gender Female, 89.7%
Angry 72.5%
Happy 12.5%
Fear 8.2%
Surprised 2.6%
Calm 1.6%
Sad 1.3%
Confused 0.7%
Disgusted 0.5%

AWS Rekognition

Age 23-33
Gender Male, 95.1%
Happy 58.8%
Calm 20.6%
Fear 13.2%
Surprised 1.8%
Angry 1.8%
Sad 1.6%
Disgusted 1.5%
Confused 0.6%

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very likely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely
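Each AWS Rekognition face record above lists every emotion with a confidence score, and the top-scoring entry is usually read as the dominant emotion. A minimal sketch of that selection (sample values are from the first face above; `dominant_emotion` is a hypothetical helper, not an API call):

```python
# Hypothetical sketch: picking the dominant emotion from a Rekognition-style
# Emotions list ({"Type", "Confidence"} pairs). Sample values are copied
# from the first AWS Rekognition face analysis above.
face_emotions = [
    {"Type": "SAD", "Confidence": 69.4},
    {"Type": "CALM", "Confidence": 8.6},
    {"Type": "DISGUSTED", "Confidence": 7.8},
    {"Type": "ANGRY", "Confidence": 7.3},
    {"Type": "HAPPY", "Confidence": 3.5},
]

def dominant_emotion(emotions):
    """Return the (type, confidence) pair with the highest confidence."""
    top = max(emotions, key=lambda e: e["Confidence"])
    return top["Type"], top["Confidence"]

print(dominant_emotion(face_emotions))  # ('SAD', 69.4)
```

Note that the dominant score here (Sad, 69.4%) is far from certain, which is why the full distribution is reported rather than a single label.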

Feature analysis

Amazon

Person 99.8%
Person 99.5%
Person 99.3%
Person 98.6%
Person 93.4%
Shoe 69.8%

Text analysis

Amazon

LOANS
7E
BENEFICIAL
AVE
OUIS AVE
OUIS
KODYK
KODYK tirn
tirn

Google

MJI3YT37 A2 ZAGON 7E
MJI3YT37
A2
ZAGON
7E