Human Generated Data

Title

Untitled (central Ohio)

Date

July 1938 – August 1938

People

Artist: Ben Shahn, American, 1898–1969

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Gift of Bernarda Bryson Shahn, P1970.575

Copyright

© President and Fellows of Harvard College

Machine Generated Data

Tags

Amazon
created on 2023-10-06

Adult 99.3
Male 99.3
Man 99.3
Person 99.3
Person 98.7
Adult 98.4
Male 98.4
Man 98.4
Person 98.4
Person 98
Adult 97.7
Person 97.7
Bride 97.7
Female 97.7
Wedding 97.7
Woman 97.7
Person 97.2
Person 96.7
Person 95.1
Person 94.9
Person 92.9
Person 91.6
Outdoors 91.3
Person 91.2
People 88.1
Adult 85.9
Person 85.9
Bride 85.9
Female 85.9
Woman 85.9
Clothing 83.1
Hat 83.1
Face 82
Head 82
Furniture 79.4
Transportation 67.2
Vehicle 67.2
Bus Stop 56.8
Bench 56.3
Architecture 55.9
Building 55.9
Shelter 55.9
Dugout 55.9
Sport 55.9
House 55.3
Housing 55.3
Porch 55.3
Shorts 55.2

Clarifai
created on 2018-05-11

people 100
adult 99.1
group together 99.1
group 98.4
child 97.9
two 97.7
man 97.6
three 96.7
four 96.7
wear 96.5
woman 95.1
war 93.4
several 92.9
offspring 92
recreation 91.7
many 91.1
one 91
home 88.8
canine 88.3
vehicle 87.6

Imagga
created on 2023-10-06

car 58.1
model t 37
vehicle 36.6
motor vehicle 33
stall 18.6
transportation 17.9
wheeled vehicle 16.8
old 16.7
auto 14.3
grass 14.2
transport 13.7
truck 13.5
travel 12.7
road 12.6
drive 12.3
outdoor 12.2
landscape 11.9
abandoned 10.7
outdoors 10.5
automobile 10.5
sky 10.2
tree 10
farm 9.8
equipment 9.8
adult 9.7
rural 9.7
broken 9.6
wheel 9.4
summer 9
work 8.8
driving 8.7
rust 8.7
rusty 8.6
male 8.5
house 8.3
seller 8.3
machine 8.3
park 8.2
bench 8.2
field 7.5
man 7.4
vacation 7.4
tractor 7.3
palanquin 7.1
country 7

Google
created on 2018-05-11

Microsoft
created on 2018-05-11

outdoor 97.1
person 96.7

Face analysis

AWS Rekognition

Age 25-35
Gender Male, 99.9%
Fear 92.6%
Surprised 9.4%
Angry 8.5%
Disgusted 2.6%
Sad 2.5%
Confused 2%
Calm 1.3%
Happy 0.6%

AWS Rekognition

Age 36-44
Gender Female, 95.6%
Calm 92.1%
Surprised 9.8%
Fear 6%
Sad 2.2%
Confused 0.8%
Happy 0.4%
Angry 0.2%
Disgusted 0.2%

AWS Rekognition

Age 48-56
Gender Male, 96.2%
Calm 80%
Sad 15.2%
Surprised 6.4%
Fear 6.2%
Confused 1.6%
Happy 1.2%
Disgusted 1.1%
Angry 0.2%

AWS Rekognition

Age 37-45
Gender Male, 56.4%
Sad 99.5%
Happy 11%
Fear 7.5%
Calm 7.3%
Surprised 7.2%
Disgusted 6.4%
Angry 4.9%
Confused 1%

AWS Rekognition

Age 35-43
Gender Male, 98.2%
Calm 87.6%
Sad 8.4%
Surprised 6.4%
Fear 6%
Angry 0.7%
Happy 0.4%
Disgusted 0.4%
Confused 0.3%

AWS Rekognition

Age 54-64
Gender Male, 97.1%
Surprised 98.5%
Calm 10.9%
Confused 6.7%
Fear 6%
Sad 2.2%
Happy 1%
Disgusted 0.8%
Angry 0.3%

AWS Rekognition

Age 24-34
Gender Female, 51.7%
Sad 99.9%
Surprised 9.3%
Fear 6.3%
Calm 6%
Confused 5.9%
Angry 2.8%
Happy 2.1%
Disgusted 2.1%

AWS Rekognition

Age 21-29
Gender Male, 73.1%
Happy 39.2%
Disgusted 26.7%
Calm 16.8%
Confused 9.1%
Surprised 6.6%
Fear 6.1%
Sad 4.5%
Angry 1.6%

Microsoft Cognitive Services

Age 26
Gender Male

Feature analysis

Amazon

Adult 99.3%
Male 99.3%
Man 99.3%
Person 99.3%
Bride 97.7%
Female 97.7%
Woman 97.7%
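Each machine-generated entry above pairs a label with a confidence score on a 0–100 scale, and a single label (e.g. "Person") may appear once per detected instance. A minimal sketch of filtering such tags by a confidence threshold, assuming a simple list of (label, score) pairs rather than any actual museum or vendor data structure:

```python
# Hypothetical (label, confidence) pairs mirroring the format of the
# tag lists above; not the museum's or the vendors' actual output schema.
tags = [("Adult", 99.3), ("Person", 98.7), ("Bus Stop", 56.8), ("Shorts", 55.2)]

def confident_tags(pairs, threshold=90.0):
    """Keep only the labels whose confidence meets the threshold."""
    return [label for label, score in pairs if score >= threshold]

print(confident_tags(tags))
```

A higher threshold trims speculative labels like "Bus Stop" (56.8) while keeping strong detections like "Adult" (99.3); lowering it keeps everything.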