Human Generated Data

Title

Untitled (Marked Tree, Arkansas?)

Date

October 1935

People

Artist: Ben Shahn, American, 1898–1969

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Gift of Bernarda Bryson Shahn, P1970.1213

Copyright

© President and Fellows of Harvard College

Machine Generated Data

Tags (confidence, %)

Amazon
created on 2023-10-06

City 100
Road 100
Street 100
Urban 100
Path 100
Sidewalk 100
Architecture 99.6
Building 99.6
Outdoors 99.6
Shelter 99.6
Face 99.4
Head 99.4
Photography 99.4
Portrait 99.4
Person 98.5
Adult 98.5
Male 98.5
Man 98.5
Person 96.5
Person 95.1
Person 95
Bus Stop 93.3
Terminal 91.8
Car 85.2
Transportation 85.2
Vehicle 85.2
Clothing 76.1
Formal Wear 76.1
Suit 76.1
Alley 71.8
Machine 68.2
Wheel 68.2
Car 63.6
Person 61.7
Coat 58
Railway 57.2
Train 57.2
Train Station 57.2
Walking 55.6
Selfie 55.3
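
These labels have the shape of Amazon Rekognition DetectLabels output. A minimal sketch of how such tags could be regenerated with boto3, assuming a local copy of the image (file name and region are hypothetical):

    import boto3

    rekognition = boto3.client("rekognition", region_name="us-east-1")  # hypothetical region

    # Read the photograph from a hypothetical local file and request labels.
    with open("marked_tree_arkansas.jpg", "rb") as f:
        response = rekognition.detect_labels(
            Image={"Bytes": f.read()},
            MinConfidence=50,  # drop labels scored below 50%
        )

    # Each label carries a name and a 0-100 confidence score,
    # which is what the list above records.
    for label in response["Labels"]:
        print(f"{label['Name']} {label['Confidence']:.1f}")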

Clarifai
created on 2018-05-11

people 99.9
adult 98.8
man 98.1
group 95.9
portrait 95.6
wear 93.9
one 91.8
group together 91.6
leader 90.2
administration 89.9
vehicle 87.5
two 86.6
street 84.9
music 83.8
military 83.7
many 82
war 81
outfit 80.3
offense 78.4
woman 75
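
These concepts match Clarifai's general model of that era. A sketch using the Clarifai v2 Python client that was current in 2018, assuming a hosted copy of the image (API key and URL are placeholders):

    from clarifai.rest import ClarifaiApp

    app = ClarifaiApp(api_key="your_api_key")  # placeholder credential
    model = app.public_models.general_model

    # Predict concepts for a hypothetical hosted copy of the photograph.
    response = model.predict_by_url("https://example.org/shahn.jpg")

    # Concepts come back with a 0-1 value; the list above shows it as a percentage.
    for concept in response["outputs"][0]["data"]["concepts"]:
        print(f"{concept['name']} {concept['value'] * 100:.1f}")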

Imagga
created on 2023-10-06

man 42.4
male 37.4
person 33.8
people 31.3
professional 29.7
office 29.2
adult 28.5
businessman 28.3
business 27.3
happy 25.1
tourist 22.1
smiling 21
work 20.4
men 19.8
job 19.5
executive 18.7
suit 18.6
smile 18.5
traveler 18.4
attractive 18.2
meeting 17.9
team 17
portrait 16.8
working 16.8
looking 16
corporate 15.5
couple 14.8
laptop 14.6
confident 14.6
building 14.5
sitting 13.7
group 13.7
businesswoman 13.6
outdoors 13.4
handsome 13.4
worker 13.2
dad 12.8
computer 12.8
father 12.4
holding 12.4
manager 12.1
success 12.1
teamwork 12.1
outside 12
occupation 11.9
happiness 11.8
colleagues 11.7
lifestyle 11.6
boss 11.5
talking 11.4
businesspeople 11.4
successful 11
communication 10.9
together 10.5
passenger 10.5
employee 10.4
women 10.3
mature 10.2
clothing 10.1
face 9.9
student 9.9
medical 9.7
indoors 9.7
parent 9.5
serious 9.5
tie 9.5
expression 9.4
love 8.7
child 8.7
leadership 8.6
workplace 8.6
desk 8.5
black 8.4
pretty 8.4
inside 8.3
fun 8.2
technology 8.2
look 7.9
diverse 7.8
discussion 7.8
lab 7.8
husband 7.7
diversity 7.7
exam 7.7
profession 7.7
two 7.6
ethnic 7.6
necktie 7.6
hand 7.6
coat 7.6
career 7.6
doctor 7.5
senior 7.5
human 7.5
car 7.5
shirt 7.5
one 7.5
company 7.4
friendly 7.3
teenager 7.3
school 7.2
paper 7.1
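
These tags follow the shape of Imagga's v2 tagging endpoint. A sketch with the requests library, assuming a hosted copy of the image (credentials and URL are placeholders):

    import requests

    # Placeholder credentials for Imagga's HTTP basic auth.
    IMAGGA_KEY = "your_api_key"
    IMAGGA_SECRET = "your_api_secret"

    response = requests.get(
        "https://api.imagga.com/v2/tags",
        params={"image_url": "https://example.org/shahn.jpg"},  # hypothetical URL
        auth=(IMAGGA_KEY, IMAGGA_SECRET),
    )

    # Imagga reports each tag with an English name and a 0-100 confidence.
    for tag in response.json()["result"]["tags"]:
        print(f"{tag['tag']['en']} {tag['confidence']:.1f}")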

Google
created on 2018-05-11

Microsoft
created on 2018-05-11

person 98.9
man 94.2
outdoor 87.4
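
These tags match Microsoft's Computer Vision tagging service. A sketch using the Azure SDK, assuming a hosted copy of the image (endpoint, key, and URL are placeholders):

    from azure.cognitiveservices.vision.computervision import ComputerVisionClient
    from msrest.authentication import CognitiveServicesCredentials

    # Placeholder endpoint and subscription key.
    client = ComputerVisionClient(
        "https://your-resource.cognitiveservices.azure.com/",
        CognitiveServicesCredentials("your_subscription_key"),
    )

    result = client.tag_image("https://example.org/shahn.jpg")  # hypothetical URL

    # Tags carry a name and a 0-1 confidence; the list above shows percentages.
    for tag in result.tags:
        print(f"{tag.name} {tag.confidence * 100:.1f}")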

Face analysis

AWS Rekognition

Age 23-31
Gender Male, 99.8%
Calm 87%
Surprised 6.4%
Fear 6.2%
Confused 5.9%
Sad 4%
Happy 0.7%
Angry 0.4%
Disgusted 0.3%

AWS Rekognition

Age 37-45
Gender Male, 99.9%
Calm 79.4%
Sad 11.1%
Surprised 6.7%
Fear 6.2%
Angry 2.7%
Confused 1.9%
Happy 1.4%
Disgusted 0.6%

AWS Rekognition

Age 20-28
Gender Male, 75.3%
Calm 96.8%
Surprised 6.3%
Fear 5.9%
Sad 2.6%
Confused 1.4%
Happy 0.3%
Angry 0.1%
Disgusted 0.1%
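
The three blocks above correspond to three faces detected in the photograph, each with an estimated age range, gender, and emotion scores. A minimal sketch of the Rekognition DetectFaces call that produces this structure (file name and region are hypothetical):

    import boto3

    rekognition = boto3.client("rekognition", region_name="us-east-1")

    # Request the full attribute set (age, gender, emotions) for each face.
    with open("marked_tree_arkansas.jpg", "rb") as f:
        response = rekognition.detect_faces(
            Image={"Bytes": f.read()},
            Attributes=["ALL"],
        )

    # One FaceDetail per detected face, mirroring the per-face blocks above.
    for face in response["FaceDetails"]:
        age = face["AgeRange"]
        print(f"Age {age['Low']}-{age['High']}")
        print(f"Gender {face['Gender']['Value']}, {face['Gender']['Confidence']:.1f}%")
        for emotion in face["Emotions"]:
            print(f"{emotion['Type'].title()} {emotion['Confidence']:.1f}%")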

Microsoft Cognitive Services

Age 36
Gender Male

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Possible
Blurred Very unlikely
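
Unlike the services above, Google Vision reports face attributes as likelihood buckets (Very unlikely through Very likely) rather than numeric scores. A sketch of the face-detection call, assuming a local copy of the image (file name hypothetical):

    from google.cloud import vision

    client = vision.ImageAnnotatorClient()

    # Load a hypothetical local copy of the photograph.
    with open("marked_tree_arkansas.jpg", "rb") as f:
        image = vision.Image(content=f.read())

    response = client.face_detection(image=image)

    # Each attribute is a Likelihood enum, matching the buckets listed above.
    for face in response.face_annotations:
        print("Surprise", vision.Likelihood(face.surprise_likelihood).name)
        print("Anger", vision.Likelihood(face.anger_likelihood).name)
        print("Sorrow", vision.Likelihood(face.sorrow_likelihood).name)
        print("Joy", vision.Likelihood(face.joy_likelihood).name)
        print("Headwear", vision.Likelihood(face.headwear_likelihood).name)
        print("Blurred", vision.Likelihood(face.blurred_likelihood).name)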

Feature analysis

Amazon

Person 98.5%
Adult 98.5%
Male 98.5%
Man 98.5%
Car 85.2%
Suit 76.1%
Wheel 68.2%

Text analysis

Amazon

ODEN
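
The fragment above has the shape of Rekognition DetectText output, which returns the words and lines it can read in an image along with confidence scores. A minimal sketch (file name and region are hypothetical):

    import boto3

    rekognition = boto3.client("rekognition", region_name="us-east-1")

    # Run text detection over a hypothetical local copy of the photograph.
    with open("marked_tree_arkansas.jpg", "rb") as f:
        response = rekognition.detect_text(Image={"Bytes": f.read()})

    # Detections come back as LINE and WORD entries; a fragment such as
    # "ODEN" would appear as a WORD detection.
    for detection in response["TextDetections"]:
        if detection["Type"] == "WORD":
            print(detection["DetectedText"], f"{detection['Confidence']:.1f}%")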