Human Generated Data

Title

Untitled (men inspecting school bus accident at railroad crossing)

Date

1950

People

Artist: O. B. Porter Studio, American, active 1930s-1940s

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.10804

Machine Generated Data

Tags

Amazon
created on 2022-01-29

Person 99.4
Human 99.4
Person 99.3
Transportation 89.9
Boat 89.9
Vehicle 89.9
Nature 84.1
Smoke 71.9
People 67.4
Crowd 66.4
Tarmac 66
Asphalt 66
Fog 62.7
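
The Amazon tags above are the kind of labels returned by AWS Rekognition label detection. A minimal sketch of such a call, assuming the boto3 SDK and a locally stored copy of the photograph (the file name and confidence threshold are illustrative, not part of this record):

import boto3

rekognition = boto3.client("rekognition")  # assumes AWS credentials are configured

# Hypothetical local copy of the photograph; not the museum's actual file name.
with open("school-bus-crossing.jpg", "rb") as f:
    response = rekognition.detect_labels(
        Image={"Bytes": f.read()},
        MinConfidence=60,  # illustrative threshold; the lowest tag above is Fog 62.7
    )

# Print each label with its confidence, matching the tag/score layout above.
for label in response["Labels"]:
    print(f"{label['Name']} {label['Confidence']:.1f}")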

Imagga
created on 2022-01-29

blackboard 83.3
stage 49.4
platform 38.4
silhouette 28.1
lights 22.3
symbol 22.2
crowd 22.1
audience 20.5
player 19.8
stadium 19.8
people 19.5
field 19.2
patriotic 19.2
nation 18.9
flag 18.7
cheering 18.6
design 18.6
event 18.5
icon 18.2
person 17.7
nighttime 17.6
skill 17.3
muscular 17.2
training 16.6
championship 16.5
sport 16.5
competition 16.5
match 16.4
athlete 16.1
billboard 15.4
vibrant 14.9
park 14.8
structure 14.8
sign 14.3
man 14.1
sky 14
negative 13.8
city 13.3
urban 13.1
glowing 12.9
signboard 12.7
bright 12.2
business 12.2
team 11.6
cityscape 11.4
building 11.2
shiny 11.1
grunge 11.1
backhand 10.9
versus 10.8
black 10.8
racket 10.8
shorts 10.8
serve 10.7
tennis 10.7
court 10.7
light 10.7
film 10.7
male 10.6
skyline 10.5
architecture 10.2
street 10.1
road 9.9
teamwork 9.3
world 9.2
travel 9.2
hand 9.1
global 9.1
star 9
education 8.7
scene 8.7
downtown 8.7
construction 8.6
modern 8.4
old 8.4
retro 8.2
landscape 8.2
technology 8.2
transportation 8.1
art 8
night 8
shoot 7.7
soccer 7.7
goal 7.7
football 7.7
human 7.5
vivid 7.4
transport 7.3
graphic 7.3
digital 7.3
group 7.3
office 7.2
computer 7.2
tower 7.2
businessman 7.1
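
The Imagga tags above come from an automatic tagging service. A hedged sketch of one way to request them, assuming Imagga's v2 REST tagging endpoint and the requests library; the API key, secret, image URL, and response layout are assumptions, not taken from this record:

import requests

IMAGGA_KEY = "your_api_key"        # placeholder credentials
IMAGGA_SECRET = "your_api_secret"  # placeholder credentials
IMAGE_URL = "https://example.org/school-bus-crossing.jpg"  # placeholder image URL

resp = requests.get(
    "https://api.imagga.com/v2/tags",
    params={"image_url": IMAGE_URL},
    auth=(IMAGGA_KEY, IMAGGA_SECRET),
)
resp.raise_for_status()

# Assumed response shape: result.tags is a list of {confidence, tag: {en}} objects.
for item in resp.json()["result"]["tags"]:
    print(f"{item['tag']['en']} {item['confidence']:.1f}")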

Google
created on 2022-01-29

Microsoft
created on 2022-01-29

text 98.9
outdoor 96.3
white 66.4
black and white 55.4
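
The Microsoft tags above resemble output from Azure Computer Vision image tagging. A sketch under that assumption, using the azure-cognitiveservices-vision-computervision SDK; the endpoint, key, and image URL are placeholders:

from azure.cognitiveservices.vision.computervision import ComputerVisionClient
from msrest.authentication import CognitiveServicesCredentials

client = ComputerVisionClient(
    "https://<your-resource>.cognitiveservices.azure.com/",  # placeholder endpoint
    CognitiveServicesCredentials("<your-key>"),               # placeholder key
)

# Tag a remote image and print each tag with its confidence as a percentage.
result = client.tag_image("https://example.org/school-bus-crossing.jpg")  # placeholder URL
for tag in result.tags:
    print(f"{tag.name} {tag.confidence * 100:.1f}")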

Face analysis

AWS Rekognition

Age 24-34
Gender Female, 73%
Sad 28.5%
Disgusted 24.7%
Calm 21.3%
Fear 10.4%
Confused 5.4%
Angry 5.4%
Surprised 2.4%
Happy 1.9%
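
The age, gender, and emotion rows above match the face attributes that AWS Rekognition returns. A minimal sketch, assuming boto3 and a local copy of the image (the file name is illustrative):

import boto3

rekognition = boto3.client("rekognition")  # assumes AWS credentials are configured

with open("school-bus-crossing.jpg", "rb") as f:  # placeholder file name
    response = rekognition.detect_faces(Image={"Bytes": f.read()}, Attributes=["ALL"])

for face in response["FaceDetails"]:
    age = face["AgeRange"]
    gender = face["Gender"]
    print(f"Age {age['Low']}-{age['High']}")
    print(f"Gender {gender['Value']}, {gender['Confidence']:.0f}%")
    # Emotions are reported with confidences; sort from most to least likely.
    for emotion in sorted(face["Emotions"], key=lambda e: -e["Confidence"]):
        print(f"{emotion['Type'].title()} {emotion['Confidence']:.1f}%")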

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very likely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very likely
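
The Google Vision rows above are likelihood ratings from its face detection, with one block per detected face. A sketch using the google-cloud-vision client; the file path and credential setup are assumptions:

from google.cloud import vision

client = vision.ImageAnnotatorClient()  # assumes GOOGLE_APPLICATION_CREDENTIALS is set

with open("school-bus-crossing.jpg", "rb") as f:  # placeholder file name
    image = vision.Image(content=f.read())

response = client.face_detection(image=image)
for face in response.face_annotations:
    # Each attribute is a Likelihood enum: VERY_UNLIKELY through VERY_LIKELY.
    print("Surprise", vision.Likelihood(face.surprise_likelihood).name)
    print("Anger", vision.Likelihood(face.anger_likelihood).name)
    print("Sorrow", vision.Likelihood(face.sorrow_likelihood).name)
    print("Joy", vision.Likelihood(face.joy_likelihood).name)
    print("Headwear", vision.Likelihood(face.headwear_likelihood).name)
    print("Blurred", vision.Likelihood(face.blurred_likelihood).name)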

Feature analysis

Amazon

Person 99.4%
Boat 89.9%

Captions

Microsoft

a group of people standing in front of a sign 64.5%
a man standing in front of a sign 63.7%
an old photo of a man 63.6%
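
The captions above resemble the candidate descriptions returned by Azure Computer Vision's describe operation. A sketch under that assumption; the endpoint, key, and image URL are placeholders:

from azure.cognitiveservices.vision.computervision import ComputerVisionClient
from msrest.authentication import CognitiveServicesCredentials

client = ComputerVisionClient(
    "https://<your-resource>.cognitiveservices.azure.com/",  # placeholder endpoint
    CognitiveServicesCredentials("<your-key>"),               # placeholder key
)

# Request up to three caption candidates and print each with its confidence.
description = client.describe_image(
    "https://example.org/school-bus-crossing.jpg",  # placeholder URL
    max_candidates=3,
)
for caption in description.captions:
    print(f"{caption.text} {caption.confidence * 100:.1f}%")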

Text analysis

Amazon

ROAD
RAIL
CROSSING

Google

CROSSING
RAIL ROAD CROSSING
ROAD
RAIL
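
The OCR readings above ("RAIL ROAD CROSSING" on the warning sign) are the kind of output produced by each provider's text-detection call. A combined sketch, assuming boto3 and google-cloud-vision with a local copy of the image (the file name is illustrative):

import boto3
from google.cloud import vision

with open("school-bus-crossing.jpg", "rb") as f:  # placeholder file name
    image_bytes = f.read()

# Amazon: Rekognition DetectText returns LINE and WORD detections.
rekognition = boto3.client("rekognition")
for detection in rekognition.detect_text(Image={"Bytes": image_bytes})["TextDetections"]:
    print(detection["Type"], detection["DetectedText"])

# Google: Cloud Vision text_detection returns the full text first, then individual words.
client = vision.ImageAnnotatorClient()
response = client.text_detection(image=vision.Image(content=image_bytes))
for annotation in response.text_annotations:
    print(annotation.description)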