Human Generated Data

Title

Son of destitute Ozark family, Arkansas

Date

1935

People

Artist: Ben Shahn, American, 1898–1969

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, 2.2002.3073

Machine Generated Data

Tags (label and confidence, %)

Amazon
created on 2021-12-15

Wheel 99.9
Machine 99.9
Tire 99.6
Person 99.5
Human 99.5
Clothing 81.1
Apparel 81.1
Car Wheel 79.4
Vehicle 61.2
Transportation 61.2
Spoke 56.4

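Labels of this shape, a name paired with a confidence percentage, are what Amazon Rekognition's DetectLabels API returns. Below is a minimal sketch of such a call; the configured AWS credentials and the local file photo.jpg are assumptions for illustration, not part of this record.

    # Hedged sketch: Rekognition-style labels for a local image file.
    # AWS credentials and the "photo.jpg" path are assumptions, not from this record.
    import boto3

    rekognition = boto3.client("rekognition")

    with open("photo.jpg", "rb") as f:
        response = rekognition.detect_labels(
            Image={"Bytes": f.read()},
            MaxLabels=20,
            MinConfidence=50,
        )

    # Print "Name Confidence" pairs in the same form as the tag list above.
    for label in response["Labels"]:
        print(f"{label['Name']} {label['Confidence']:.1f}")
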
Clarifai
created on 2023-10-15

people 99.9
vehicle 99.4
transportation system 98.4
one 98.4
portrait 98.3
two 98.3
adult 98.2
car 97.8
child 96.5
vintage 96.2
man 93.1
driver 92.5
boy 91.7
retro 91.5
wheel 89.3
woman 89.2
nostalgia 85.5
sepia 84.7
monochrome 81.9
girl 81.5

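Concept lists like the one above can be produced with Clarifai's v2 predict endpoint. The sketch below is illustrative only: the API key, model ID, and image URL are placeholders, and Clarifai's current API may additionally require user and app identifiers.

    # Hedged sketch of a Clarifai v2 predict call against a general model.
    # API key, model ID, and image URL are placeholders, not from this record.
    import requests

    resp = requests.post(
        "https://api.clarifai.com/v2/models/general-image-recognition/outputs",
        headers={"Authorization": "Key YOUR_API_KEY"},
        json={"inputs": [{"data": {"image": {"url": "https://example.com/photo.jpg"}}}]},
    )
    resp.raise_for_status()

    # Concept values are 0-1 fractions; scale to match the percentages above.
    for concept in resp.json()["outputs"][0]["data"]["concepts"]:
        print(f"{concept['name']} {concept['value'] * 100:.1f}")
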
Imagga
created on 2021-12-15

tire 63
hoop 42.4
car 37.3
band 31.3
adult 25.2
person 25.1
child 24.9
wheel 24.8
automobile 23
people 22.9
vehicle 22.6
hay 21.5
auto 21
outdoors 20.9
strip 20.8
field 20.1
grass 19.8
sitting 18.9
smiling 18.8
summer 18.6
one 17.9
lifestyle 17.3
outdoor 16.8
smile 16.4
transportation 16.1
cute 15.8
happy 15.7
portrait 15.5
pretty 15.4
attractive 15.4
fashion 15.1
happiness 14.9
man 14.8
male 14.3
fodder 14.2
drive 14.2
model 13.2
road 12.6
driver 12.6
day 12.5
joy 12.5
men 12
outside 12
fun 12
hair 11.9
leisure 11.6
cover girl 11.1
youth 11.1
relaxation 10.9
rural 10.6
feed 10.5
looking 10.4
love 10.3
casual 10.2
freedom 10.1
face 9.9
meadow 9.9
cheerful 9.7
lady 9.7
seat 9.6
couple 9.6
rest 9.4
teenager 9.1
park 9.1
sky 8.9
little 8.8
country 8.8
work 8.7
boy 8.7
spring 8.6
groom 8.1
sexy 8
family 8
posing 8
women 7.9
autumn 7.9
together 7.9
look 7.9
travel 7.7
driving 7.7
broken 7.7
black 7.6
relax 7.6
suit 7.6
food 7.4
environment 7.4
sun 7.2
dress 7.2
childhood 7.2
farm 7.1

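Imagga exposes tags of this kind through its v2 /tags endpoint, authenticated with HTTP Basic auth. A minimal sketch follows, assuming placeholder API credentials and a hosted copy of the image.

    # Hedged sketch of an Imagga v2 tagging request.
    # Credentials and the image URL are placeholders.
    import requests

    resp = requests.get(
        "https://api.imagga.com/v2/tags",
        params={"image_url": "https://example.com/photo.jpg"},
        auth=("YOUR_API_KEY", "YOUR_API_SECRET"),
    )
    resp.raise_for_status()

    # Imagga reports confidence on a 0-100 scale, as in the list above.
    for tag in resp.json()["result"]["tags"]:
        print(f"{tag['tag']['en']} {tag['confidence']:.1f}")
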
Google
created on 2021-12-15

Microsoft
created on 2021-12-15

person 99.2
text 99.1
outdoor 99.1
grass 96.5
tire 90.6
wheel 88.5
clothing 86.9
boy 86.1
young 84.7
tractor 81
land vehicle 80.4
auto part 74.5
black and white 74.4
vehicle 67.9
human face 61.8
posing 46.1
vintage 28.6

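Tags in this form match Azure Computer Vision's Tag Image operation, which returns names with 0-1 confidences. Below is a hedged sketch against the v3.2 REST endpoint; the resource endpoint, subscription key, and image URL are all placeholders.

    # Hedged sketch of an Azure Computer Vision v3.2 "tag" request.
    # Endpoint, subscription key, and image URL are placeholders.
    import requests

    endpoint = "https://YOUR_RESOURCE.cognitiveservices.azure.com"
    resp = requests.post(
        f"{endpoint}/vision/v3.2/tag",
        headers={"Ocp-Apim-Subscription-Key": "YOUR_KEY"},
        json={"url": "https://example.com/photo.jpg"},
    )
    resp.raise_for_status()

    # Confidences come back as 0-1 fractions; scale to percentages as above.
    for tag in resp.json()["tags"]:
        print(f"{tag['name']} {tag['confidence'] * 100:.1f}")
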
Color Analysis

Face analysis

AWS Rekognition (face 1 of 2)

Age 2-8
Gender Female, 58.5%
Angry 41%
Sad 17.5%
Fear 15%
Confused 14%
Calm 6.7%
Surprised 4%
Disgusted 1.2%
Happy 0.7%

AWS Rekognition (face 2 of 2)

Age 21-33
Gender Female, 58.3%
Calm 90.5%
Sad 6.3%
Fear 2.1%
Angry 0.4%
Happy 0.3%
Surprised 0.2%
Confused 0.1%
Disgusted 0.1%

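Per-face age ranges, gender guesses, and ranked emotion scores like the two blocks above are what Amazon Rekognition's DetectFaces API returns when all attributes are requested. A minimal sketch, again assuming AWS credentials and a hypothetical photo.jpg:

    # Hedged sketch: one result block per detected face, as above.
    # AWS credentials and "photo.jpg" are assumptions, not part of this record.
    import boto3

    rekognition = boto3.client("rekognition")

    with open("photo.jpg", "rb") as f:
        response = rekognition.detect_faces(
            Image={"Bytes": f.read()},
            Attributes=["ALL"],  # request age, gender, and emotion estimates
        )

    for face in response["FaceDetails"]:
        age = face["AgeRange"]
        print(f"Age {age['Low']}-{age['High']}")
        print(f"Gender {face['Gender']['Value']}, {face['Gender']['Confidence']:.1f}%")
        # Emotions sorted by confidence, highest first, as in the blocks above.
        for emotion in sorted(face["Emotions"], key=lambda e: -e["Confidence"]):
            print(f"{emotion['Type'].title()} {emotion['Confidence']:.1f}%")
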
Microsoft Cognitive Services

Age 6
Gender Male

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

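The ratings above (Very unlikely, Unlikely, and so on) match Google Cloud Vision's face-detection response, which reports categorical likelihoods rather than percentages. A sketch assuming the google-cloud-vision client library, configured application credentials, and a placeholder image URI:

    # Hedged sketch of Google Cloud Vision face detection.
    # Assumes GOOGLE_APPLICATION_CREDENTIALS is set; the image URI is a placeholder.
    from google.cloud import vision

    def fmt(likelihood):
        # Render enum names like VERY_UNLIKELY as "Very unlikely", as in the record.
        return likelihood.name.replace("_", " ").capitalize()

    client = vision.ImageAnnotatorClient()
    image = vision.Image()
    image.source.image_uri = "https://example.com/photo.jpg"

    response = client.face_detection(image=image)
    for face in response.face_annotations:
        print("Surprise", fmt(face.surprise_likelihood))
        print("Anger", fmt(face.anger_likelihood))
        print("Sorrow", fmt(face.sorrow_likelihood))
        print("Joy", fmt(face.joy_likelihood))
        print("Headwear", fmt(face.headwear_likelihood))
        print("Blurred", fmt(face.blurred_likelihood))
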
Feature analysis

Amazon

Wheel 99.9%
Person 99.5%