Human Generated Data

Title

Untitled (Marked Tree, Arkansas?)

Date

October 1935

People

Artist: Ben Shahn, American, 1898 - 1969

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Gift of Bernarda Bryson Shahn, P1970.1214

Copyright

© President and Fellows of Harvard College

Machine Generated Data

Tags

Amazon
created on 2023-10-06

Face 100
Head 100
Photography 100
Portrait 100
Person 99.6
Adult 99.6
Male 99.6
Man 99.6
Clothing 96.5
Hat 96.5
Machine 95.5
Wheel 95.5
Cap 94.7
Car 93.5
Transportation 93.5
Vehicle 93.5
Baseball Cap 91.5
Wheel 90
Car 90
Car 87.1
Wheel 85.4
Wheel 83.2
Spoke 77.9
Antique Car 76.8
Model T 76.8
Alloy Wheel 73.3
Car Wheel 73.3
Tire 73.3
Car 73.1
Outdoors 70.1
Wheel 64.4
Person 62.7
License Plate 57.8
Coat 57
Architecture 56.5
Building 56.5
Shelter 56.5
Nature 56
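
The Amazon tags above are label names paired with confidence scores from an object-and-scene detection service. Below is a minimal sketch of how a comparable list could be produced with AWS Rekognition's DetectLabels call; the local file path and the MinConfidence threshold are illustrative assumptions, not values taken from this record.

```python
# Sketch: produce an Amazon-style label list for one image with AWS Rekognition.
# Assumes AWS credentials are configured; the file path and threshold are placeholders.
import boto3

rekognition = boto3.client("rekognition")

with open("marked_tree_arkansas.jpg", "rb") as f:  # hypothetical local copy of the photograph
    image_bytes = f.read()

response = rekognition.detect_labels(
    Image={"Bytes": image_bytes},
    MinConfidence=50,  # assumed cutoff; this record lists tags down to the mid-50s
)

# Print "Name Confidence" pairs, mirroring the tag list in this record.
for label in sorted(response["Labels"], key=lambda l: l["Confidence"], reverse=True):
    print(f'{label["Name"]} {label["Confidence"]:.1f}')
```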

Clarifai
created on 2018-05-11

people 99.9
man 99.5
adult 99.4
one 99.2
portrait 98.6
two 96.7
group together 96.7
wear 96.4
administration 94.9
group 94.8
three 94
leader 91.6
vehicle 87.9
golfer 86.1
monochrome 86.1
veil 85.7
several 85.6
actor 85.3
war 84.7
music 84.7

Imagga
created on 2023-10-06

man 41
person 36.6
male 34.8
people 32.3
senior 29
happy 28.8
adult 26.6
hat 24.3
outdoors 24.2
elderly 23.9
portrait 23.3
men 22.3
old 20.2
mature 19.5
smile 18.5
grandfather 18
lifestyle 17.3
smiling 16.6
old-timer 16.6
couple 16.5
happiness 15.7
face 15.6
retirement 15.4
casual 15.2
looking 14.4
outdoor 13.8
love 13.4
active 13
outside 12.8
attractive 12.6
age 12.4
worker 12.1
home 12
aged 11.8
standing 11.3
hair 11.1
beach 11
summer 10.9
city 10.8
handsome 10.7
cheerful 10.6
lady 10.6
together 10.5
one 10.4
glasses 10.2
leisure 10
park 9.9
middle aged 9.7
older 9.7
retired 9.7
engineer 9.6
industry 9.4
guy 9.2
fun 9
boy 8.7
day 8.6
model 8.6
pretty 8.4
joy 8.3
sky 8.3
building 8.2
gray 8.1
romance 8
beard 8
women 7.9
grandmother 7.8
sitting 7.7
construction 7.7
life 7.7
health 7.6
husband 7.6
two 7.6
hand 7.6
head 7.6
healthy 7.6
one person 7.5
alone 7.3
black 7.2
sport 7.1
romantic 7.1
family 7.1
working 7.1
work 7.1
sea 7
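
The Imagga tags above pair an English tag name with a confidence score. A minimal sketch of requesting such tags from Imagga's v2 tagging endpoint follows; the image URL and credentials are placeholders, and the endpoint and response shape are assumptions drawn from Imagga's public API documentation rather than anything stated in this record.

```python
# Sketch: request Imagga-style tags for an image URL via the v2 tagging endpoint.
# Credentials and URL are placeholders; response shape assumed from Imagga's public docs.
import requests

IMAGGA_API_KEY = "YOUR_KEY"        # placeholder
IMAGGA_API_SECRET = "YOUR_SECRET"  # placeholder

response = requests.get(
    "https://api.imagga.com/v2/tags",
    params={"image_url": "https://example.org/marked-tree-arkansas.jpg"},  # hypothetical image URL
    auth=(IMAGGA_API_KEY, IMAGGA_API_SECRET),
    timeout=30,
)
response.raise_for_status()

# Print "tag confidence" pairs, mirroring the list in this record.
for entry in response.json()["result"]["tags"]:
    print(entry["tag"]["en"], round(entry["confidence"], 1))
```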

Google
created on 2018-05-11

Microsoft
created on 2018-05-11

person 99
man 99
outdoor 98.4
old 75
older 31.6

Color Analysis

Face analysis

AWS Rekognition

Age 53-61
Gender Male, 100%
Calm 99.9%
Surprised 6.3%
Fear 5.9%
Sad 2.2%
Confused 0%
Angry 0%
Happy 0%
Disgusted 0%
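
The age range, gender, and emotion percentages above follow the shape of AWS Rekognition's DetectFaces output. A minimal sketch of retrieving the same attributes is given below; the file path is a placeholder and AWS credentials are assumed to be configured.

```python
# Sketch: face attributes (age range, gender, emotions) via AWS Rekognition DetectFaces.
import boto3

rekognition = boto3.client("rekognition")

with open("marked_tree_arkansas.jpg", "rb") as f:  # hypothetical local copy
    image_bytes = f.read()

response = rekognition.detect_faces(
    Image={"Bytes": image_bytes},
    Attributes=["ALL"],  # request age range, gender, emotions, and other attributes
)

for face in response["FaceDetails"]:
    age = face["AgeRange"]
    print(f'Age {age["Low"]}-{age["High"]}')
    print(f'Gender {face["Gender"]["Value"]}, {face["Gender"]["Confidence"]:.0f}%')
    for emotion in sorted(face["Emotions"], key=lambda e: e["Confidence"], reverse=True):
        print(f'{emotion["Type"].title()} {emotion["Confidence"]:.1f}%')
```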

Microsoft Cognitive Services

Age 79
Gender Male

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very likely
Blurred Very unlikely
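
Google Vision reports face attributes as likelihood buckets ("Very unlikely" through "Very likely") rather than percentages. A minimal sketch using the google-cloud-vision client is shown below; the file path is a placeholder and application credentials are assumed to be configured.

```python
# Sketch: likelihood-style face attributes via the Google Cloud Vision API.
from google.cloud import vision

client = vision.ImageAnnotatorClient()

with open("marked_tree_arkansas.jpg", "rb") as f:  # hypothetical local copy
    content = f.read()

response = client.face_detection(image=vision.Image(content=content))

# Map the Likelihood enum to the labels used in this record.
likelihood_names = [
    "Unknown", "Very unlikely", "Unlikely", "Possible", "Likely", "Very likely",
]

for face in response.face_annotations:
    print("Surprise", likelihood_names[face.surprise_likelihood])
    print("Anger", likelihood_names[face.anger_likelihood])
    print("Sorrow", likelihood_names[face.sorrow_likelihood])
    print("Joy", likelihood_names[face.joy_likelihood])
    print("Headwear", likelihood_names[face.headwear_likelihood])
    print("Blurred", likelihood_names[face.blurred_likelihood])
```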

Feature analysis

Amazon

Person 99.6%
Adult 99.6%
Male 99.6%
Man 99.6%
Wheel 95.5%
Car 93.5%