Human Generated Data

Title

Untitled (Camden, Tennessee)

Date

October 1935

People

Artist: Ben Shahn, American, 1898 - 1969

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Gift of Bernarda Bryson Shahn, P1970.1408

Copyright

© President and Fellows of Harvard College

Machine Generated Data

Tags

Amazon
created on 2021-12-15

Person 99.7
Human 99.7
Person 99.1
Person 98.6
Person 97.8
Motor Scooter 95.6
Vespa 95.6
Motorcycle 95.6
Vehicle 95.6
Transportation 95.6
Person 95
Person 93.7
Person 93.1
Moped 86.2
Clothing 86
Apparel 86
Person 67.1
Person 66.5
Hat 56.9
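
The Amazon tags above are label/confidence pairs (the numbers are confidence scores in percent) and match the output shape of Amazon Rekognition's label detection. A minimal sketch of such a request with boto3 follows; the bucket, key, and thresholds are placeholders, not the settings used for this record.

import boto3

# Minimal sketch: label detection with Amazon Rekognition (boto3).
# The S3 location and thresholds are placeholders.
rekognition = boto3.client("rekognition", region_name="us-east-1")

response = rekognition.detect_labels(
    Image={"S3Object": {"Bucket": "example-bucket", "Name": "camden-tennessee.jpg"}},
    MaxLabels=25,
    MinConfidence=50.0,
)

# Each label carries a name and a percent confidence, the form the tags above use.
# Labels such as Person can also include Instances with bounding boxes, which is
# the kind of per-object result the Feature analysis section below reports.
for label in response["Labels"]:
    print(f"{label['Name']} {label['Confidence']:.1f}")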

Clarifai
created on 2023-10-15

people 99.9
many 98.9
group together 98.2
street 98.1
monochrome 97.3
man 95.8
group 95.7
adult 94.3
transportation system 90.1
woman 88.9
cavalry 88.5
chair 88.4
crowd 86.2
child 84.9
wear 77.5
vehicle 75.8
boy 75.4
war 73.5
soldier 70.4
wagon 69.8
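
The Clarifai concepts above appear to come from a general-purpose recognition model. A hedged sketch of a comparable call against Clarifai's v2 REST API follows; the model ID, request shape, and image URL are assumptions based on Clarifai's public API, not a record of how this page was produced.

import requests

# Hedged sketch: concept tagging via Clarifai's v2 REST API.
# The model ID and image URL are assumptions; substitute a real API key.
API_KEY = "YOUR_CLARIFAI_API_KEY"
MODEL_ID = "general-image-recognition"  # assumed public model ID

response = requests.post(
    f"https://api.clarifai.com/v2/models/{MODEL_ID}/outputs",
    headers={"Authorization": f"Key {API_KEY}"},
    json={"inputs": [{"data": {"image": {"url": "https://example.org/image.jpg"}}}]},
    timeout=30,
)
response.raise_for_status()

# Concepts come back with names and 0-1 values, comparable to the
# percentages listed above.
for concept in response.json()["outputs"][0]["data"]["concepts"]:
    print(f"{concept['name']} {concept['value'] * 100:.1f}")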

Imagga
created on 2021-12-15

half track 100
vehicle 93.9
military vehicle 89.5
tracked vehicle 89
wheeled vehicle 47.6
conveyance 46.3
military uniform 42.5
uniform 37.4
city 28.3
clothing 23.9
architecture 20.3
travel 19.7
transportation 18.8
locomotive 17.7
steam 17.5
building 17.4
consumer goods 17
covering 16.8
old 16.7
history 16.1
sky 15.3
street 13.8
transport 13.7
train 13.5
smoke 13
railway 12.7
house 12.5
power 11.8
military 11.6
tourism 11.5
urban 11.4
industrial 10.9
railroad 10.8
vacation 10.6
summer 10.3
tank 9.9
track 9.8
historical 9.4
water 9.3
town 9.3
tree 9.2
outdoor 9.2
landscape 8.9
war 8.7
roof 8.6
industry 8.5
commodity 8.5
vintage 8.3
gun 8.2
historic 8.2
machine 8
rural 7.9
soldier 7.8
steam locomotive 7.7
cityscape 7.6
buildings 7.6
outdoors 7.5
car 7.3
road 7.2
steel 7.1
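
The Imagga tags above pair a score with a tag word. Imagga's auto-tagging is exposed through a REST endpoint with HTTP basic authentication; the sketch below assumes the v2 /tags endpoint and its image_url parameter, with placeholder credentials.

import requests

# Hedged sketch: auto-tagging via Imagga's v2 REST API (assumed endpoint shape).
API_KEY = "YOUR_IMAGGA_KEY"
API_SECRET = "YOUR_IMAGGA_SECRET"

response = requests.get(
    "https://api.imagga.com/v2/tags",
    params={"image_url": "https://example.org/image.jpg"},
    auth=(API_KEY, API_SECRET),
    timeout=30,
)
response.raise_for_status()

# Each entry carries a confidence and the tag text, matching the
# tag/score pairs listed above.
for tag in response.json()["result"]["tags"]:
    print(f"{tag['tag']['en']} {tag['confidence']:.1f}")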

Google
created on 2021-12-15

Microsoft
created on 2021-12-15

sky 99
outdoor 98.2
text 95.7
person 87.1
clothing 80.1
vehicle 78
man 51.2
crowd 0.8
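
The Microsoft tags resemble the output of Azure Computer Vision's image-tagging operation. A hedged sketch with the azure-cognitiveservices-vision-computervision SDK follows; the endpoint, key, and image URL are placeholders, and the service version behind this record is unknown.

from azure.cognitiveservices.vision.computervision import ComputerVisionClient
from msrest.authentication import CognitiveServicesCredentials

# Hedged sketch: image tagging with Azure Computer Vision.
# Endpoint, key, and image URL are placeholders.
client = ComputerVisionClient(
    "https://YOUR-RESOURCE.cognitiveservices.azure.com/",
    CognitiveServicesCredentials("YOUR_AZURE_KEY"),
)

result = client.tag_image("https://example.org/image.jpg")

# Each tag has a name and a 0-1 confidence, comparable to the scores above.
for tag in result.tags:
    print(f"{tag.name} {tag.confidence * 100:.1f}")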

Color Analysis

Face analysis

AWS Rekognition

Age 26-40
Gender Female, 68%
Calm 94.2%
Sad 3.4%
Fear 0.8%
Angry 0.4%
Happy 0.4%
Surprised 0.3%
Confused 0.3%
Disgusted 0.1%

AWS Rekognition

Age 36-52
Gender Male, 92.3%
Calm 69.6%
Sad 23.6%
Angry 1.8%
Confused 1.7%
Surprised 1.6%
Fear 0.9%
Disgusted 0.5%
Happy 0.4%

AWS Rekognition

Age 48-66
Gender Male, 85%
Calm 98.6%
Angry 0.5%
Sad 0.4%
Surprised 0.3%
Confused 0.1%
Happy 0.1%
Fear 0%
Disgusted 0%

AWS Rekognition

Age 23-37
Gender Female, 93.7%
Sad 45.5%
Fear 25.1%
Happy 14%
Calm 10.1%
Confused 2.7%
Surprised 1.1%
Angry 0.9%
Disgusted 0.6%

AWS Rekognition

Age 24-38
Gender Male, 96.7%
Calm 98.4%
Sad 0.9%
Surprised 0.3%
Angry 0.2%
Confused 0.1%
Fear 0.1%
Happy 0%
Disgusted 0%

AWS Rekognition

Age 29-45
Gender Female, 93.4%
Calm 64.3%
Happy 31.3%
Sad 2.3%
Angry 0.8%
Fear 0.5%
Surprised 0.3%
Confused 0.2%
Disgusted 0.2%
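
The six face records above (age range, gender with confidence, and ranked emotions) follow the structure of Amazon Rekognition's DetectFaces response. A minimal boto3 sketch for reading those fields is below; the image location is a placeholder.

import boto3

# Minimal sketch: face attributes with Amazon Rekognition DetectFaces.
# The S3 location is a placeholder, not the source of this record.
rekognition = boto3.client("rekognition", region_name="us-east-1")

response = rekognition.detect_faces(
    Image={"S3Object": {"Bucket": "example-bucket", "Name": "camden-tennessee.jpg"}},
    Attributes=["ALL"],  # request age range, gender, emotions, etc.
)

for face in response["FaceDetails"]:
    age = face["AgeRange"]
    gender = face["Gender"]
    print(f"Age {age['Low']}-{age['High']}")
    print(f"Gender {gender['Value']}, {gender['Confidence']:.1f}%")
    # Emotions arrive with per-emotion confidences, listed above in descending order.
    for emotion in sorted(face["Emotions"], key=lambda e: -e["Confidence"]):
        print(f"{emotion['Type'].capitalize()} {emotion['Confidence']:.1f}%")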

Microsoft Cognitive Services

Age 26
Gender Male
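
The single age and gender estimate above matches the face-attribute output of Microsoft's Face service. A hedged sketch with the azure-cognitiveservices-vision-face SDK follows; the endpoint, key, image URL, and the availability of these attributes in current service versions are all assumptions.

from azure.cognitiveservices.vision.face import FaceClient
from msrest.authentication import CognitiveServicesCredentials

# Hedged sketch: age/gender attributes with the Azure Face API.
# Endpoint, key, and image URL are placeholders; attribute availability
# depends on the service version in use.
face_client = FaceClient(
    "https://YOUR-RESOURCE.cognitiveservices.azure.com/",
    CognitiveServicesCredentials("YOUR_AZURE_KEY"),
)

faces = face_client.face.detect_with_url(
    url="https://example.org/image.jpg",
    return_face_attributes=["age", "gender"],
)

for face in faces:
    attrs = face.face_attributes
    print("Age", round(attrs.age))
    print("Gender", attrs.gender)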

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very likely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely
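
The per-face ratings above use the likelihood scale (Very unlikely through Very likely) that Google Cloud Vision face detection returns for emotion, headwear, and blur. A minimal sketch with the google-cloud-vision client follows; the image URI is a placeholder and authentication is assumed to use application default credentials.

from google.cloud import vision

# Minimal sketch: face likelihoods with Google Cloud Vision.
# The image URI is a placeholder.
client = vision.ImageAnnotatorClient()
image = vision.Image(source=vision.ImageSource(image_uri="https://example.org/image.jpg"))

response = client.face_detection(image=image)

likelihood = ("Unknown", "Very unlikely", "Unlikely", "Possible", "Likely", "Very likely")
for face in response.face_annotations:
    print("Surprise", likelihood[face.surprise_likelihood])
    print("Anger", likelihood[face.anger_likelihood])
    print("Sorrow", likelihood[face.sorrow_likelihood])
    print("Joy", likelihood[face.joy_likelihood])
    print("Headwear", likelihood[face.headwear_likelihood])
    print("Blurred", likelihood[face.blurred_likelihood])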

Feature analysis

Amazon

Person 99.7%

Text analysis

Amazon

BUILDI
THREADG
HARD

Google

THREADG HARD BUILDI
THREADG
HARD
BUILDI
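
The text fragments above are OCR detections: the Amazon list follows the shape of Rekognition's DetectText output (word-level fragments), while the Google result also includes a combined line. A minimal Rekognition sketch is below; the image location is a placeholder.

import boto3

# Minimal sketch: OCR with Amazon Rekognition DetectText.
# The S3 location is a placeholder, not the source of this record.
rekognition = boto3.client("rekognition", region_name="us-east-1")

response = rekognition.detect_text(
    Image={"S3Object": {"Bucket": "example-bucket", "Name": "camden-tennessee.jpg"}}
)

# Detections are typed LINE or WORD; WORD entries correspond to fragments
# such as "BUILDI" and "HARD" above.
for detection in response["TextDetections"]:
    print(detection["Type"], detection["DetectedText"], f"{detection['Confidence']:.1f}%")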