Human Generated Data

Title

Untitled (Ozarks, Arkansas)

Date

October 1935, printed later

People

Artist: Ben Shahn, American, 1898–1969

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Museum Acquisition, P1970.3402

Copyright

© President and Fellows of Harvard College

Machine Generated Data

Tags

Amazon
created on 2023-10-06

Alloy Wheel 99.9
Car Wheel 99.9
Machine 99.9
Spoke 99.9
Tire 99.9
Transportation 99.9
Vehicle 99.9
Clothing 99.4
Coat 99.4
Jacket 99.4
Boy 99.3
Male 99.3
Person 99.3
Teen 99.3
Wheel 99.3
Face 98.9
Head 98.9
Photography 98.9
Portrait 98.9
Person 82.3
Sitting 82.2
Body Part 71.7
Finger 71.7
Hand 71.7
Car 63.1
Pants 55.5

Clarifai
created on 2018-05-10

people 99.8
vehicle 98.9
one 97.7
adult 97.3
child 95.8
transportation system 94
portrait 93.5
monochrome 93.4
facial expression 93.1
nostalgia 92.9
car 92.8
woman 87
wear 87
man 86.7
driver 86.6
actress 85.4
retro 84.9
two 83.6
administration 81.7
convertible 81.6

Imagga
created on 2023-10-06

tire 47.6
hay 42.1
vehicle 34.2
car 33.9
wheel 31
fodder 24.8
tractor 24.6
grass 23.7
hoop 23.6
machine 23.4
farm 22.3
field 21.7
feed 21.5
rural 21.1
plow 18.5
sky 17.9
band 17.4
auto 17.2
outdoors 17.2
tool 17.1
summer 16.7
automobile 16.3
transportation 16.1
equipment 15.7
machinery 15.6
road 15.4
agriculture 14.9
man 14.8
person 14.7
people 14.5
food 13.9
child 13.5
dirt 13.4
work 13.3
drive 13.2
farmer 13.2
outdoor 13
strip 12.6
old 12.5
engine 12.5
farming 12.3
transport 11.9
driving 11.6
rustic 11.6
male 11.4
landscape 11.2
motor 10.6
sitting 10.3
industry 10.2
wheeled vehicle 10.2
spring 10.2
happiness 10.2
smiling 10.1
sport 9.9
travel 9.9
agricultural 9.7
fun 9.7
working 9.7
adult 9.7
men 9.4
speed 9.2
broken 8.7
crop 8.5
countryside 8.2
cart 8
worker 8
autumn 7.9
cannon 7.9
day 7.8
smile 7.8
antique 7.8
straw 7.7
outside 7.7
race 7.6
fashion 7.5
happy 7.5
joy 7.5
barrow 7.3
yellow 7.3
lifestyle 7.2
cute 7.2
motor vehicle 7.1
country 7

Microsoft
created on 2018-05-10

grass 98.7
person 98.6
outdoor 97.6
boy 85.6
young 79.7
posing 35
vintage 28.4

Face analysis

AWS Rekognition

Age 4-12
Gender Male, 100%
Confused 57.3%
Angry 15.6%
Fear 9.5%
Sad 7.8%
Surprised 7.6%
Calm 4.9%
Disgusted 2%
Happy 0.6%

Microsoft Cognitive Services

Age 5
Gender Female

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Boy 99.3%
Male 99.3%
Person 99.3%
Teen 99.3%
Wheel 99.3%
Car 63.1%

Text analysis

Amazon

59%F.

Google

49% F.
49
%
F.