Human Generated Data

Title

Untitled (Ozarks, Arkansas)

Date

October 1935

People

Artist: Ben Shahn, American, 1898 - 1969

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Gift of Bernarda Bryson Shahn, P1970.2815

Copyright

© President and Fellows of Harvard College

Machine Generated Data

Tags

Each tag below is paired with the generating service's confidence score, expressed as a percentage.

Amazon
created on 2023-10-06

Boy 98.5
Child 98.5
Male 98.5
Person 98.5
Male 97.7
Person 97.7
Adult 97.7
Man 97.7
Face 96.6
Head 96.6
Photography 96.6
Portrait 96.6
Person 95.3
Baby 95.3
Person 88
Clothing 87.5
Coat 87.5
Gun 87.3
Weapon 87.3
Outdoors 78
Glove 74.3
Car 72.4
Transportation 72.4
Vehicle 72.4
Nature 60.1
Firearm 60.1
Hat 57.8
Cap 55.8
Rifle 55.7
Windshield 55
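
The Amazon tags above have the shape of output from Rekognition's DetectLabels API. A minimal sketch, assuming AWS credentials are configured and the photograph is saved locally under the hypothetical name ozarks_arkansas.jpg (the MaxLabels and MinConfidence values are also assumptions, not settings documented in this record):

```python
import boto3

# Minimal sketch: call Rekognition's DetectLabels on a local image.
# The file name and both thresholds below are assumptions.
client = boto3.client("rekognition")

with open("ozarks_arkansas.jpg", "rb") as f:
    response = client.detect_labels(
        Image={"Bytes": f.read()},
        MaxLabels=50,        # upper bound on labels returned
        MinConfidence=55.0,  # discard labels scored below 55%
    )

# Print "Label confidence" pairs in the same style as the list above.
for label in response["Labels"]:
    print(f'{label["Name"]} {label["Confidence"]:.1f}')
```

A MinConfidence of 55.0 would mirror the lowest scores kept above (Cap 55.8, Windshield 55), but the actual cutoff used is not stated in the record.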

Clarifai
created on 2018-05-10

people 99.9
vehicle 99.5
adult 99.3
one 98.8
transportation system 97.3
two 97.1
portrait 96.4
child 96.3
man 96.1
war 95.2
military 94.4
wear 94
soldier 92.6
group 91.8
outfit 89.6
car 89.5
woman 88.9
group together 88.2
three 86.8
aircraft 86.8

Imagga
created on 2023-10-06

car 59.9
vehicle 41.2
automobile 37.3
driver 36.9
person 31.7
transportation 30.5
sitting 30.1
auto 27.7
carriage 27.6
people 27.3
seat 27.3
man 26.2
driving 26.1
adult 24.6
business 24.3
drive 23.6
happy 21.9
mirror 20.7
smile 20
male 19.9
car mirror 18.3
transport 18.3
work 18
laptop 17.8
motor vehicle 17.4
computer 16.9
attractive 16.8
smiling 16.6
windshield wiper 15.6
pretty 15.4
portrait 14.9
device 14.7
support 14.6
corporate 14.6
outdoors 14.2
wheel 14.1
businesswoman 13.6
professional 13.5
travel 13.4
happiness 13.3
businessman 13.2
office 12.8
face 12.8
one 12.7
technology 12.6
mechanical device 12.6
hand 12.1
looking 12
hair 11.9
road 11.7
new 11.3
passenger 11.2
window 11
reflector 10.9
job 10.6
working 10.6
sit 10.4
inside 10.1
lifestyle 10.1
model 10.1
cute 10
student 10
suit 9.9
mechanism 9.8
fashion 9.8
cheerful 9.7
summer 9.6
chair 9.4
youth 9.4
communication 9.2
outdoor 9.2
modern 9.1
holding 9.1
human 9
lady 8.9
worker 8.9
steering 8.9
rumble 8.8
boy 8.7
notebook 8.6
wheeled vehicle 8.6
trip 8.5
golf equipment 8.3
model t 8.3
success 8
sexy 8
interior 8
brunette 7.8
windshield 7.8
motor 7.7
casual 7.6
traffic 7.6
joy 7.5
fun 7.5
phone 7.4
executive 7.4
successful 7.3
squeegee 7.3
black 7.2
life 7.2
steering wheel 7
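
For comparison, Imagga exposes tagging over plain REST rather than an SDK. A hedged sketch against its public v2 /tags endpoint; the credentials, image URL, and exact response shape are assumptions based on Imagga's published API, not on this record:

```python
import requests

# Hedged sketch of Imagga's v2 tagging endpoint. Key, secret, and the
# image URL are all hypothetical placeholders.
IMAGGA_KEY = "your_api_key"
IMAGGA_SECRET = "your_api_secret"

response = requests.get(
    "https://api.imagga.com/v2/tags",
    params={"image_url": "https://example.org/ozarks_arkansas.jpg"},
    auth=(IMAGGA_KEY, IMAGGA_SECRET),  # HTTP Basic auth
)
response.raise_for_status()

# Assumed response shape: result.tags[] with tag.en and confidence.
for entry in response.json()["result"]["tags"]:
    print(f'{entry["tag"]["en"]} {entry["confidence"]:.1f}')
```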

Google
created on 2018-05-10

Microsoft
created on 2018-05-10

outdoor 97.2
person 96.2

Color Analysis

(color swatches from the original page are not reproduced in this text extraction)

Face analysis

AWS Rekognition

Age 33-41
Gender Male, 99.9%
Calm 99.9%
Surprised 6.3%
Fear 5.9%
Sad 2.2%
Angry 0%
Confused 0%
Disgusted 0%
Happy 0%

AWS Rekognition

Age 4-12
Gender Male, 99.6%
Sad 97.4%
Confused 38.4%
Fear 7.9%
Surprised 7.3%
Calm 1.3%
Angry 0.7%
Disgusted 0.5%
Happy 0.3%

AWS Rekognition

Age 1-7
Gender Female, 50.5%
Calm 59.5%
Confused 15.5%
Surprised 14.2%
Fear 6.3%
Happy 5.6%
Sad 3%
Angry 3%
Disgusted 2.2%
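
The three AWS Rekognition blocks above (age range, gender, per-emotion confidences) match the shape of Rekognition's DetectFaces response when all facial attributes are requested. A minimal sketch, reusing the hypothetical local file name from earlier:

```python
import boto3

# Minimal sketch: DetectFaces with Attributes=["ALL"] returns one
# FaceDetails entry per detected face, including AgeRange, Gender,
# and a per-emotion confidence list. The file name is hypothetical.
client = boto3.client("rekognition")

with open("ozarks_arkansas.jpg", "rb") as f:
    response = client.detect_faces(
        Image={"Bytes": f.read()},
        Attributes=["ALL"],  # request age, gender, emotions, etc.
    )

for face in response["FaceDetails"]:
    print(f'Age {face["AgeRange"]["Low"]}-{face["AgeRange"]["High"]}')
    print(f'Gender {face["Gender"]["Value"]}, {face["Gender"]["Confidence"]:.1f}%')
    # Emotions arrive unsorted; order them to match the blocks above.
    for emotion in sorted(face["Emotions"], key=lambda e: -e["Confidence"]):
        print(f'{emotion["Type"].title()} {emotion["Confidence"]:.1f}%')
```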

Microsoft Cognitive Services

Age 6
Gender Female

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Possible
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very likely
Blurred Very unlikely
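
The Google blocks report likelihood buckets ("Very unlikely" through "Very likely") rather than numeric scores, which is how the Cloud Vision face-detection API expresses these attributes. A minimal sketch, again with a hypothetical file name and assuming Google Cloud credentials are available:

```python
from google.cloud import vision

# Minimal sketch: Cloud Vision face detection returns Likelihood enums
# (VERY_UNLIKELY .. VERY_LIKELY) for each facial attribute.
client = vision.ImageAnnotatorClient()

with open("ozarks_arkansas.jpg", "rb") as f:
    image = vision.Image(content=f.read())

faces = client.face_detection(image=image).face_annotations
for face in faces:
    for name, value in [
        ("Surprise", face.surprise_likelihood),
        ("Anger", face.anger_likelihood),
        ("Sorrow", face.sorrow_likelihood),
        ("Joy", face.joy_likelihood),
        ("Headwear", face.headwear_likelihood),
        ("Blurred", face.blurred_likelihood),
    ]:
        # Render VERY_UNLIKELY as "Very unlikely" to match the blocks above.
        print(name, vision.Likelihood(value).name.replace("_", " ").capitalize())
```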

Feature analysis

Amazon

Boy 98.5%
Child 98.5%
Male 98.5%
Person 98.5%
Adult 97.7%
Man 97.7%
Baby 95.3%
Gun 87.3%
Glove 74.3%
Car 72.4%