Human Generated Data

Title

Untitled (Omar, Scotts Run, West Virginia)

Date

October 1935, printed later

People

Artist: Ben Shahn, American, 1898–1969

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Museum Acquisition, P1970.3441

Copyright

© President and Fellows of Harvard College

Machine Generated Data

Tags

Amazon
created on 2021-12-15

Person 99.2
Human 99.2
Railway 97.4
Rail 97.4
Train Track 97.4
Transportation 97.4
Person 84.9
Building 75.4
Clothing 71.7
Apparel 71.7
Outdoors 67.9
Countryside 60.2
Nature 60.2
Worker 55.1

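For reference, a minimal sketch of how label/confidence pairs like those above can be produced with Amazon Rekognition via boto3. The S3 location and confidence threshold are illustrative assumptions, not details of the museum's actual pipeline.

```python
import boto3

rekognition = boto3.client("rekognition")

# Hypothetical image location; Rekognition also accepts raw image bytes.
response = rekognition.detect_labels(
    Image={"S3Object": {"Bucket": "example-bucket", "Name": "shahn-omar.jpg"}},
    MinConfidence=50,  # assumed threshold; the scores above run from ~55 to ~99
)

for label in response["Labels"]:
    print(f'{label["Name"]} {label["Confidence"]:.1f}')  # e.g. "Person 99.2"
```
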
Clarifai
created on 2023-10-15

people 99.9
railway 99.4
train 99
monochrome 98.9
adult 97.6
street 97
man 96.7
group together 94.4
one 94.3
transportation system 94.3
vehicle 93.4
group 91.2
locomotive 91.1
two 90.5
vintage 88.2
war 86.1
black and white 84.7
grinder 84.1
town 83.9
industry 81.8

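The Clarifai concepts above look like output from the public "general" image-recognition model. A hedged sketch using the official clarifai-grpc client; the model ID, app identifiers, credential, and image URL are assumptions based on Clarifai's published examples.

```python
from clarifai_grpc.channel.clarifai_channel import ClarifaiChannel
from clarifai_grpc.grpc.api import resources_pb2, service_pb2, service_pb2_grpc

stub = service_pb2_grpc.V2Stub(ClarifaiChannel.get_grpc_channel())
metadata = (("authorization", "Key YOUR_PAT"),)  # placeholder credential

request = service_pb2.PostModelOutputsRequest(
    user_app_id=resources_pb2.UserAppIDSet(user_id="clarifai", app_id="main"),
    model_id="general-image-recognition",
    inputs=[resources_pb2.Input(data=resources_pb2.Data(
        image=resources_pb2.Image(url="https://example.com/photo.jpg")))],
)

response = stub.PostModelOutputs(request, metadata=metadata)
for concept in response.outputs[0].data.concepts:
    print(concept.name, round(concept.value * 100, 1))  # e.g. "people 99.9"
```
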
Imagga
created on 2021-12-15

track 81.2
train 26.9
travel 26.1
landscape 25.3
transportation 25.1
railroad 24.6
railway 22.6
outdoors 22
sky 18.5
structure 18.1
old 17.4
rails 16.8
transport 16.4
trees 15.1
tourism 14.9
rail 14.7
industry 14.5
road 14.5
architecture 13.4
scenic 13.2
building 13.1
summer 12.9
grass 12.7
mountain 12.5
rural 12.3
outdoor 12.2
water 12
outside 12
station 11.9
industrial 11.8
tracks 11.8
sea 11.7
way 11.5
vacation 11.5
empty 11.2
city 10.8
ocean 10.8
urban 10.5
journey 10.4
pier 10.2
beach 10.1
lake 10.1
steel 9.9
coast 9.9
device 9.8
fence 9.7
vehicle 9.6
barrier 9.5
trip 9.4
tree 9.4
bridge 9.3
machine 9.3
mountains 9.3
house 9.2
pen 9.2
tourist 8.9
metal 8.9
sand 8.7
tie 8.7
day 8.6
cloud 8.6
perspective 8.5
snow 8.1
horizon 8.1
new 8.1
wooden 7.9
country 7.9
transit 7.9
platform 7.9
forest 7.8
cold 7.8
enclosure 7.7
shore 7.7
clouds 7.6
field 7.5
south 7.5
man 7.4
street 7.4
light 7.4
brace 7.3
support 7.2
holiday 7.2
river 7.1
line 7.1
plant 7.1

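Imagga exposes its tagger as a simple REST endpoint. A sketch with requests and HTTP Basic auth; the credentials and image URL are placeholders, and the endpoint and response shape follow Imagga's public v2 API documentation.

```python
import requests

response = requests.get(
    "https://api.imagga.com/v2/tags",
    params={"image_url": "https://example.com/photo.jpg"},
    auth=("API_KEY", "API_SECRET"),  # placeholder credentials
)

for item in response.json()["result"]["tags"]:
    print(item["tag"]["en"], round(item["confidence"], 1))  # e.g. "track 81.2"
```
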
Google
created on 2021-12-15

Microsoft
created on 2021-12-15

text 98.5
outdoor 97.4
railroad 97.1
ground 95.5
rail 83
black and white 80.8
locomotive 76.1
train 71.8
clothing 71.7
person 61.6

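The Microsoft tags are consistent with Azure Computer Vision "analyze" output, with confidences scaled to 0-100. A sketch of the REST call; the resource endpoint, key, and API version are placeholder assumptions.

```python
import requests

endpoint = "https://YOUR_RESOURCE.cognitiveservices.azure.com"
response = requests.post(
    f"{endpoint}/vision/v3.2/analyze",
    params={"visualFeatures": "Tags"},
    headers={"Ocp-Apim-Subscription-Key": "YOUR_KEY"},
    json={"url": "https://example.com/photo.jpg"},
)

for tag in response.json()["tags"]:
    print(tag["name"], round(tag["confidence"] * 100, 1))  # e.g. "outdoor 97.4"
```
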
Face analysis

AWS Rekognition

Age 23-35
Gender Male, 99.8%
Calm 86.1%
Angry 8.2%
Sad 2.3%
Fear 1.7%
Happy 0.8%
Surprised 0.4%
Confused 0.3%
Disgusted 0.2%

AWS Rekognition

Age 38-56
Gender Male, 98.6%
Calm 97.5%
Surprised 0.7%
Confused 0.6%
Angry 0.5%
Sad 0.5%
Happy 0.1%
Disgusted 0%
Fear 0%

AWS Rekognition

Age 26-42
Gender Male, 50.9%
Calm 52.2%
Happy 15.1%
Disgusted 11%
Sad 6.4%
Fear 4.6%
Surprised 3.7%
Confused 3.6%
Angry 3.5%

AWS Rekognition

Age 36-54
Gender Female, 76.1%
Happy 96%
Sad 1.4%
Calm 1.4%
Angry 0.4%
Surprised 0.3%
Disgusted 0.2%
Fear 0.2%
Confused 0.1%

AWS Rekognition

Age 39-57
Gender Female, 76.1%
Happy 46.4%
Calm 37.4%
Fear 4.1%
Sad 3.8%
Surprised 2.8%
Confused 2.7%
Angry 2.3%
Disgusted 0.5%

AWS Rekognition

Age 36-54
Gender Male, 89.2%
Happy 87.8%
Calm 6.5%
Sad 2.2%
Fear 1.2%
Angry 1%
Disgusted 0.6%
Surprised 0.5%
Confused 0.3%

AWS Rekognition

Age 47-65
Gender Male, 89.6%
Calm 84%
Surprised 6.9%
Angry 3.2%
Sad 2.9%
Confused 2.7%
Disgusted 0.1%
Fear 0.1%
Happy 0.1%

AWS Rekognition

Age 27-43
Gender Female, 63%
Happy 38.9%
Angry 26.4%
Sad 13.5%
Calm 9%
Confused 6.6%
Fear 2.3%
Surprised 1.9%
Disgusted 1.4%

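Each AWS Rekognition block above (age range, gender, ranked emotions) matches the shape of detect_faces output when all attributes are requested. A minimal sketch; the image location is a hypothetical placeholder.

```python
import boto3

rekognition = boto3.client("rekognition")
response = rekognition.detect_faces(
    Image={"S3Object": {"Bucket": "example-bucket", "Name": "shahn-omar.jpg"}},
    Attributes=["ALL"],  # requests age range, gender, emotions, etc.
)

for face in response["FaceDetails"]:
    age = face["AgeRange"]
    print(f'Age {age["Low"]}-{age["High"]}')
    print(f'Gender {face["Gender"]["Value"]}, {face["Gender"]["Confidence"]:.1f}%')
    for emotion in sorted(face["Emotions"], key=lambda e: -e["Confidence"]):
        print(f'{emotion["Type"].title()} {emotion["Confidence"]:.1f}%')
```
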
Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very likely
Blurred Very unlikely

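Google Cloud Vision reports face attributes as likelihood buckets ("Very unlikely" through "Very likely") rather than percentages, which matches the block above. A sketch using the google-cloud-vision client; the image URI is a placeholder.

```python
from google.cloud import vision

client = vision.ImageAnnotatorClient()
image = vision.Image(source=vision.ImageSource(image_uri="https://example.com/photo.jpg"))

response = client.face_detection(image=image)
for face in response.face_annotations:
    print("Joy", vision.Likelihood(face.joy_likelihood).name)
    print("Sorrow", vision.Likelihood(face.sorrow_likelihood).name)
    print("Anger", vision.Likelihood(face.anger_likelihood).name)
    print("Surprise", vision.Likelihood(face.surprise_likelihood).name)
    print("Headwear", vision.Likelihood(face.headwear_likelihood).name)
```
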
Feature analysis

Amazon

Person 99.2%

Text analysis

Amazon

MINES
RRIGAN
STABLES
EN
SDAYS
IRENE
IRENE HERVEY
HERVEY
110c
RDROCK
STABLES MINES 785
WARNING
The
MEKROEF The
SMOGE
785
MEKROEF

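The fragments above are consistent with Rekognition OCR on painted signs, which is noisy (hence partial strings like "RRIGAN"). A sketch of the detect_text call; the image location is a placeholder.

```python
import boto3

rekognition = boto3.client("rekognition")
response = rekognition.detect_text(
    Image={"S3Object": {"Bucket": "example-bucket", "Name": "shahn-omar.jpg"}}
)

for detection in response["TextDetections"]:
    if detection["Type"] == "WORD":  # "LINE" entries group adjacent words
        print(detection["DetectedText"])
```
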
Google

SDAYS STABLES MINES EN RD ROCK RRIGAN ENE HERVEY
SDAYS
STABLES
MINES
EN
RD
ROCK
RRIGAN
ENE
HERVEY
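
Google's list begins with one long line followed by individual tokens, which matches Cloud Vision text detection: the first annotation is the full recovered block and the rest are single words. A sketch; the image URI is a placeholder.

```python
from google.cloud import vision

client = vision.ImageAnnotatorClient()
image = vision.Image(source=vision.ImageSource(image_uri="https://example.com/photo.jpg"))

response = client.text_detection(image=image)
for annotation in response.text_annotations:
    # The first annotation is the whole detected block; the rest are words.
    print(annotation.description)
```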