Human Generated Data

Title

Residents of Columbus, Ohio

Date

1938

People

Artist: Ben Shahn, American, 1898 - 1969

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, 2.2002.3006

Machine Generated Data

Tags

Amazon
created on 2021-12-15

Person 99.8
Human 99.8
Person 99.5
Footwear 99
Apparel 99
Shoe 99
Clothing 99
Hat 98.3
Person 96.8
Shoe 96.4
Person 94.8
Stick 87.3
Tarmac 85.6
Asphalt 85.6
Cane 79
Road 71.4
Path 70.3
Sitting 56.9
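
The Amazon tags above are object- and scene-level labels with 0-100 confidence scores, consistent with Amazon Rekognition's label-detection API. A minimal sketch of that call, assuming boto3 with configured AWS credentials and a placeholder image file (not the actual museum asset):

    import boto3

    rekognition = boto3.client("rekognition", region_name="us-east-1")

    # Hypothetical local copy of the photograph.
    with open("shahn_columbus_ohio.jpg", "rb") as f:
        response = rekognition.detect_labels(
            Image={"Bytes": f.read()},
            MaxLabels=25,
            MinConfidence=50,
        )

    # Each label pairs a name with a confidence score, matching the
    # "tag  score" rows listed above.
    for label in response["Labels"]:
        print(f'{label["Name"]} {label["Confidence"]:.1f}')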

Clarifai
created on 2023-10-15

people 99.8
elderly 98.7
two 98.2
lid 98.1
man 97.7
three 97.5
adult 97.2
monochrome 97
portrait 95.9
street 95.3
veil 95.2
group together 92.8
group 92.7
four 89.1
wear 87.4
documentary 87.4
walking stick 84.4
seat 82.2
old 82.1
chair 81.9
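
The Clarifai concepts above likewise pair a tag with a confidence score. One plausible way to request them is Clarifai's v2 REST API; the key, model id, and image URL below are placeholders, and the exact auth and routing depend on the account setup:

    import requests

    CLARIFAI_KEY = "YOUR_CLARIFAI_KEY"              # placeholder credential
    MODEL_ID = "general-image-recognition"          # assumed public model id
    IMAGE_URL = "https://example.org/columbus.jpg"  # placeholder image URL

    resp = requests.post(
        f"https://api.clarifai.com/v2/models/{MODEL_ID}/outputs",
        headers={"Authorization": f"Key {CLARIFAI_KEY}"},
        json={"inputs": [{"data": {"image": {"url": IMAGE_URL}}}]},
        timeout=30,
    )
    resp.raise_for_status()

    # Concepts carry a 0-1 value, comparable to the percentage-style scores above.
    for concept in resp.json()["outputs"][0]["data"]["concepts"]:
        print(f'{concept["name"]} {concept["value"] * 100:.1f}')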

Imagga
created on 2021-12-15

man 30.9
people 25.1
hat 23.2
male 22.7
person 22.3
cowboy hat 21.6
sky 20.4
beach 20.2
water 19.3
travel 18.3
sea 18
ocean 16.6
sunset 16.2
summer 15.4
vacation 14.7
clothing 14.5
outdoor 14.5
sun 14.5
couple 13.9
headdress 13.7
adult 13.6
sand 13.3
world 12.6
walking 12.3
men 12
coast 11.7
silhouette 11.6
lifestyle 11.6
building 11.4
crutch 11.3
outdoors 11.2
landscape 11.2
old 11.1
love 11
relax 10.9
architecture 10.9
life 10.7
tourism 10.7
two 10.2
scholar 9.7
working 9.7
together 9.6
walk 9.5
work 9.4
happy 9.4
industry 9.4
waves 9.3
environment 9
staff 8.9
family 8.9
job 8.8
seller 8.5
stone 8.4
black 8.4
father 8.4
lady 8.1
child 8
business 7.9
holiday 7.9
intellectual 7.8
sitting 7.7
outside 7.7
newspaper 7.7
senior 7.5
leisure 7.5
shore 7.4
worker 7.4
safety 7.4
occupation 7.3
peaceful 7.3
alone 7.3
tourist 7.2
looking 7.2
portrait 7.1
day 7.1
parent 7
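
The Imagga tags above also come back as tag/confidence pairs. A hedged sketch against Imagga's v2 tagging endpoint, with placeholder credentials and image URL:

    import requests

    IMAGGA_KEY = "YOUR_IMAGGA_KEY"                  # placeholder credential
    IMAGGA_SECRET = "YOUR_IMAGGA_SECRET"            # placeholder credential
    IMAGE_URL = "https://example.org/columbus.jpg"  # placeholder image URL

    resp = requests.get(
        "https://api.imagga.com/v2/tags",
        params={"image_url": IMAGE_URL},
        auth=(IMAGGA_KEY, IMAGGA_SECRET),
        timeout=30,
    )
    resp.raise_for_status()

    # Each entry pairs an English tag with a 0-100 confidence, as listed above.
    for item in resp.json()["result"]["tags"]:
        print(f'{item["tag"]["en"]} {item["confidence"]:.1f}')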

Google
created on 2021-12-15

Microsoft
created on 2021-12-15

text 99.7
outdoor 99.6
clothing 99
ground 98.4
person 94.7
man 93.4
hat 87
footwear 78.4
black and white 77.8
sidewalk 73.8
fashion accessory 71.3
smile 60.7
street 54.4
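
The Microsoft tags above are consistent with Azure's Computer Vision image-analysis service. A sketch assuming the azure-cognitiveservices-vision-computervision Python SDK, with a placeholder endpoint, key, and image URL:

    from azure.cognitiveservices.vision.computervision import ComputerVisionClient
    from azure.cognitiveservices.vision.computervision.models import VisualFeatureTypes
    from msrest.authentication import CognitiveServicesCredentials

    ENDPOINT = "https://<your-resource>.cognitiveservices.azure.com/"  # placeholder
    KEY = "YOUR_AZURE_KEY"                                             # placeholder
    IMAGE_URL = "https://example.org/columbus.jpg"                     # placeholder

    client = ComputerVisionClient(ENDPOINT, CognitiveServicesCredentials(KEY))
    analysis = client.analyze_image(IMAGE_URL, visual_features=[VisualFeatureTypes.tags])

    # Tags carry a 0-1 confidence; scale to match the 0-100 figures above.
    for tag in analysis.tags:
        print(f"{tag.name} {tag.confidence * 100:.1f}")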

Color Analysis

Face analysis

AWS Rekognition

Age 37-55
Gender Male, 95.8%
Calm 86.7%
Angry 5%
Sad 4.3%
Happy 2.1%
Disgusted 0.6%
Surprised 0.5%
Fear 0.4%
Confused 0.4%

AWS Rekognition

Age 53-71
Gender Male, 96.2%
Calm 99.1%
Surprised 0.5%
Happy 0.1%
Angry 0.1%
Sad 0.1%
Fear 0%
Disgusted 0%
Confused 0%
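
Both face estimates above (an age range, gender, and emotion distribution per detected face) match the output of Amazon Rekognition's face-detection API with full attributes requested. A minimal boto3 sketch, again with a placeholder image file:

    import boto3

    rekognition = boto3.client("rekognition", region_name="us-east-1")

    # Hypothetical local copy of the photograph.
    with open("shahn_columbus_ohio.jpg", "rb") as f:
        response = rekognition.detect_faces(
            Image={"Bytes": f.read()},
            Attributes=["ALL"],  # include age range, gender, and emotions
        )

    for face in response["FaceDetails"]:
        age, gender = face["AgeRange"], face["Gender"]
        print(f'Age {age["Low"]}-{age["High"]}')
        print(f'Gender {gender["Value"]}, {gender["Confidence"]:.1f}%')
        for emotion in sorted(face["Emotions"], key=lambda e: -e["Confidence"]):
            print(f'{emotion["Type"].capitalize()} {emotion["Confidence"]:.1f}%')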

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very likely
Blurred Very unlikely
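
The Google Vision ratings above are per-face likelihood buckets (Very unlikely through Very likely) rather than numeric scores. A sketch of the corresponding face-detection call, assuming the google-cloud-vision client library and a placeholder image file:

    from google.cloud import vision

    # Assumes application credentials are configured (GOOGLE_APPLICATION_CREDENTIALS).
    client = vision.ImageAnnotatorClient()

    # Hypothetical local copy of the photograph.
    with open("shahn_columbus_ohio.jpg", "rb") as f:
        image = vision.Image(content=f.read())

    response = client.face_detection(image=image)

    for face in response.face_annotations:
        # Likelihood fields are enums such as VERY_UNLIKELY, UNLIKELY, VERY_LIKELY.
        print("Surprise", face.surprise_likelihood.name)
        print("Anger", face.anger_likelihood.name)
        print("Sorrow", face.sorrow_likelihood.name)
        print("Joy", face.joy_likelihood.name)
        print("Headwear", face.headwear_likelihood.name)
        print("Blurred", face.blurred_likelihood.name)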

Feature analysis

Amazon

Person 99.8%
Shoe 99%
Hat 98.3%

Captions

Text analysis

Amazon

TO
E
ORE
Г TO ORE
25e
Г

Google

25. E STO OR
E
OR
25.
STO
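
The text fragments above are OCR detections from the signage in the photograph, reported verbatim with whatever characters the models recognized. A minimal sketch of the Amazon side, assuming boto3 and a placeholder image file; Rekognition returns LINE and WORD detections, each with its own confidence:

    import boto3

    rekognition = boto3.client("rekognition", region_name="us-east-1")

    # Hypothetical local copy of the photograph.
    with open("shahn_columbus_ohio.jpg", "rb") as f:
        response = rekognition.detect_text(Image={"Bytes": f.read()})

    for detection in response["TextDetections"]:
        print(detection["Type"], detection["DetectedText"],
              f'{detection["Confidence"]:.1f}')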