Human Generated Data

Title

Untitled (Ozarks, Arkansas)

Date

October 1935

People

Artist: Ben Shahn, American 1898 - 1969

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Gift of Bernarda Bryson Shahn, P1970.1104

Copyright

© President and Fellows of Harvard College

Machine Generated Data

Tags

Amazon
created on 2023-10-05

Machine 100
Spoke 100
Tire 99.9
Alloy Wheel 99.8
Car Wheel 99.8
Transportation 99.8
Vehicle 99.8
Clothing 99.8
Coat 99.8
Wheel 99.5
Boy 99.4
Child 99.4
Male 99.4
Person 99.4
Jacket 98.9
Face 97.5
Head 97.5
Photography 97.5
Portrait 97.5
Car 70.2
Body Part 66.6
Finger 66.6
Hand 66.6
Sitting 61.7
Axle 55.9
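Label/confidence pairs like the Amazon list above are typically produced by a label-detection API such as Amazon Rekognition's DetectLabels. The sketch below is illustrative only: it extracts and sorts labels from a hypothetical response dict shaped like Rekognition's documented output (a live call would use boto3's `rekognition.detect_labels`); the sample values are copied from the tags above, and `extract_labels` and `min_confidence` are names introduced here, not part of any listed service.

```python
# Sketch: pulling (name, confidence) pairs out of a DetectLabels-style
# response. "sample_response" is hypothetical data in Rekognition's
# documented shape, using a few of the tag values listed above.

def extract_labels(response, min_confidence=50.0):
    """Return (name, confidence) pairs sorted by confidence, descending."""
    pairs = [
        (label["Name"], round(label["Confidence"], 1))
        for label in response.get("Labels", [])
        if label["Confidence"] >= min_confidence
    ]
    return sorted(pairs, key=lambda p: p[1], reverse=True)

sample_response = {
    "Labels": [
        {"Name": "Wheel", "Confidence": 99.5},
        {"Name": "Person", "Confidence": 99.4},
        {"Name": "Car", "Confidence": 70.2},
        {"Name": "Axle", "Confidence": 55.9},
    ]
}

for name, conf in extract_labels(sample_response):
    print(f"{name} {conf}")
```

Filtering by a confidence floor is why the services above report different tag counts: each provider applies (or exposes) its own threshold.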

Clarifai
created on 2018-05-11

people 99.9
one 99
vehicle 98.9
adult 97.7
child 97
transportation system 94.9
two 92.2
group 90.5
portrait 90.3
man 89.1
facial expression 88.8
military 88.8
war 88.4
boy 87.8
wear 87
administration 85.3
monochrome 84.9
woman 83.8
car 83.5
retro 80.7

Imagga
created on 2023-10-05

hay 37.5
vehicle 33.3
machine 27.9
tractor 27
wheel 26.3
grass 23.7
car 22.3
tire 21.8
child 21.7
machinery 21.4
farm 21.4
fodder 20.1
field 20.1
rural 19.4
feed 18.7
outdoors 18.7
summer 18.6
equipment 18.4
person 16.1
agriculture 15.8
old 15.3
sky 15.3
dirt 15.3
farming 14.2
outdoor 13.8
industry 13.7
transportation 13.4
work 13.4
auto 13.4
cart 13.3
plow 13.1
food 13
farmer 12.9
man 12.8
tool 12.6
wagon 12.6
heavy 12.4
working 12.4
portrait 12.3
wheeled vehicle 12.1
countryside 11.9
transport 11.9
road 11.7
people 11.7
engine 11.6
smiling 11.6
adult 11
agricultural 10.7
park 10.7
happy 10.6
earth 10.4
landscape 10.4
drive 10.4
cute 10
autumn 9.7
sitting 9.4
outside 9.4
rustic 9.1
truck 9.1
industrial 9.1
horse cart 8.8
motor 8.7
straw 8.7
male 8.7
spring 8.6
harvest 8.5
pretty 8.4
yellow 7.9
smile 7.8
happiness 7.8
wheat 7.7
automobile 7.7
race 7.6
power 7.6
crop 7.5
sport 7.4
land 7.4
natural 7.4
metal 7.2
meadow 7.2
face 7.1
kid 7.1
day 7.1

Microsoft
created on 2018-05-11

outdoor 99.4
person 99.2
boy 94.4
young 81.3
outdoor object 30

Face analysis

AWS Rekognition

Age 6-12
Gender Male, 100%
Sad 66.4%
Confused 32.4%
Angry 13.9%
Fear 9.4%
Calm 7.4%
Surprised 7.4%
Disgusted 4.1%
Happy 1%
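Face attributes like the age range, gender, and emotion scores above come from a face-analysis API such as Amazon Rekognition's DetectFaces (called with all attributes enabled). The following is a minimal sketch, not the actual pipeline: it summarizes a hypothetical face record in Rekognition's documented shape, with sample values taken from the figures above; `summarize_face` is a name introduced here for illustration.

```python
# Sketch: summarizing a DetectFaces-style face record. "sample_face"
# is hypothetical data in Rekognition's documented shape; a live call
# would use rekognition.detect_faces(..., Attributes=["ALL"]).

def summarize_face(face):
    """Return age range, predicted gender, and the top-scoring emotion."""
    age = face["AgeRange"]
    emotions = sorted(
        face["Emotions"], key=lambda e: e["Confidence"], reverse=True
    )
    return {
        "age": f"{age['Low']}-{age['High']}",
        "gender": face["Gender"]["Value"],
        "top_emotion": emotions[0]["Type"],
    }

sample_face = {
    "AgeRange": {"Low": 6, "High": 12},
    "Gender": {"Value": "Male", "Confidence": 100.0},
    "Emotions": [
        {"Type": "SAD", "Confidence": 66.4},
        {"Type": "CONFUSED", "Confidence": 32.4},
        {"Type": "HAPPY", "Confidence": 1.0},
    ],
}

print(summarize_face(sample_face))
```

Note that emotion confidences are independent per-emotion scores, not a probability distribution, which is why the percentages above do not sum to 100.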

Microsoft Cognitive Services

Age 5
Gender Female

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Wheel 99.5%
Boy 99.4%
Child 99.4%
Male 99.4%
Person 99.4%
Car 70.2%

Captions