Human Generated Data

Title

Untitled (fallen horse on street in winter)

Date

1940s

People

Artist: Mary Lowber Tiers, American, 1916–1985

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.1354

Machine Generated Data

Tags (confidence scores out of 100)

Amazon
created on 2022-01-22

Human 99.2
Person 99.2
Person 99.1
Truck 98.1
Transportation 98.1
Vehicle 98.1
Person 96.3
Person 93.9
Person 92
Car 88.9
Automobile 88.9
Car 88.7
Clothing 82.5
Apparel 82.5
Mammal 77.6
Animal 77.6
Spoke 61.9
Machine 61.9
Bull 59.7
Arrow 56.8
Symbol 56.8
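
The Amazon tags above are label/confidence pairs of the kind returned by Amazon Rekognition's label detection. Below is a minimal sketch of that call via boto3; the image file name is a placeholder, not the museum's actual asset.

```python
# Minimal sketch of producing label/confidence pairs like those above with
# Amazon Rekognition via boto3. The image file name is a placeholder.
import boto3

rekognition = boto3.client("rekognition", region_name="us-east-1")

with open("fallen_horse_winter.jpg", "rb") as f:  # hypothetical local copy
    response = rekognition.detect_labels(
        Image={"Bytes": f.read()},
        MaxLabels=25,
        MinConfidence=50,
    )

for label in response["Labels"]:
    # Prints e.g. "Person 99.2", matching the format of the list above.
    print(f"{label['Name']} {label['Confidence']:.1f}")
```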

Clarifai
created on 2023-10-26

people 100
group together 99.5
adult 99.1
vehicle 98.6
waste 98.5
street 98.3
group 98.1
many 97.9
war 96.7
man 96.2
several 94.4
transportation system 94.3
administration 93.3
interaction 92
skirmish 92
wear 91
soldier 91
military 90.9
reclining 89.3
furniture 89.3
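
The Clarifai concepts above could be retrieved with a call along these lines. This is a hedged sketch assuming Clarifai's v2 REST "outputs" endpoint and its general classification model; the API key, model id, and image URL are all placeholders.

```python
# Hedged sketch of fetching Clarifai concepts for an image. Assumes the
# Clarifai v2 REST "outputs" endpoint and its general model; the key,
# model id, and image URL are placeholders.
import requests

CLARIFAI_API_KEY = "YOUR_API_KEY"            # placeholder
MODEL_ID = "general-image-recognition"       # assumed general model id
IMAGE_URL = "https://example.org/image.jpg"  # placeholder

resp = requests.post(
    f"https://api.clarifai.com/v2/models/{MODEL_ID}/outputs",
    headers={"Authorization": f"Key {CLARIFAI_API_KEY}"},
    json={"inputs": [{"data": {"image": {"url": IMAGE_URL}}}]},
    timeout=30,
)
resp.raise_for_status()

for concept in resp.json()["outputs"][0]["data"]["concepts"]:
    # Clarifai returns values in 0-1; scale to match the 0-100 list above.
    print(f"{concept['name']} {concept['value'] * 100:.1f}")
```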

Imagga
created on 2022-01-22

stretcher 39.1
litter 31.3
engineer 28.1
man 24.2
conveyance 24
television camera 23.1
equipment 22.7
male 19.8
people 18.9
television equipment 18.5
work 18
industry 17.1
person 17.1
old 15.3
worker 15.1
steel 14.1
electronic equipment 14.1
industrial 13.6
working 13.2
safety 12.9
outdoors 12.7
job 12.4
machinist 12.4
adult 12.3
machine 12.1
travel 11.3
metal 11.2
factory 11.2
construction 10.3
lifestyle 10.1
power 10.1
helmet 9.8
business 9.7
building 9.6
photographer 9.5
leisure 9.1
protection 9.1
danger 9.1
uniform 9
vehicle 8.8
soldier 8.8
car 8.8
architecture 8.6
men 8.6
fire 8.4
machinery 8
to 8
mask 7.8
plant 7.6
heavy 7.6
wheel 7.5
field 7.5
city 7.5
one 7.5
sport 7.5
heat 7.4
environment 7.4
gun 7.4
street 7.4
bag 7.2
farm 7.1
day 7.1
backpack 7
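
The Imagga tags above follow the shape of Imagga's tagging response. A hedged sketch, assuming the v2 /tags endpoint with HTTP Basic auth; the key, secret, and image URL are placeholders.

```python
# Hedged sketch of requesting Imagga tags. Assumes Imagga's v2 /tags endpoint
# with HTTP Basic auth; the key, secret, and image URL are placeholders.
import requests

IMAGGA_KEY = "YOUR_KEY"                      # placeholder
IMAGGA_SECRET = "YOUR_SECRET"                # placeholder
IMAGE_URL = "https://example.org/image.jpg"  # placeholder

resp = requests.get(
    "https://api.imagga.com/v2/tags",
    params={"image_url": IMAGE_URL},
    auth=(IMAGGA_KEY, IMAGGA_SECRET),
    timeout=30,
)
resp.raise_for_status()

for entry in resp.json()["result"]["tags"]:
    # Prints e.g. "stretcher 39.1", matching the format of the list above.
    print(f"{entry['tag']['en']} {entry['confidence']:.1f}")
```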

Microsoft
created on 2022-01-22

outdoor 99.4
black and white 91.2
person 90.9
man 81.7
clothing 78.6
firefighter 76
text 70.1
vehicle 68.4
land vehicle 66.1
monochrome 65.8
people 60.3
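
The Microsoft tags above match the output of Azure Computer Vision image analysis with the Tags feature. A hedged sketch against the assumed v3.2 REST endpoint; the resource endpoint, key, and image URL are placeholders.

```python
# Hedged sketch of getting tags from Azure Computer Vision's "analyze"
# endpoint (assumed v3.2 REST API); endpoint, key, and URL are placeholders.
import requests

AZURE_ENDPOINT = "https://YOUR_RESOURCE.cognitiveservices.azure.com"  # placeholder
AZURE_KEY = "YOUR_KEY"                                                # placeholder
IMAGE_URL = "https://example.org/image.jpg"                           # placeholder

resp = requests.post(
    f"{AZURE_ENDPOINT}/vision/v3.2/analyze",
    params={"visualFeatures": "Tags"},
    headers={"Ocp-Apim-Subscription-Key": AZURE_KEY},
    json={"url": IMAGE_URL},
    timeout=30,
)
resp.raise_for_status()

for tag in resp.json()["tags"]:
    # Azure reports confidence in 0-1; scale to match the 0-100 list above.
    print(f"{tag['name']} {tag['confidence'] * 100:.1f}")
```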

Face analysis

AWS Rekognition

Age 18-26
Gender Male, 97.2%
Calm 61.1%
Sad 30.2%
Fear 3.3%
Confused 2.6%
Angry 1.2%
Disgusted 0.8%
Surprised 0.6%
Happy 0.3%

AWS Rekognition

Age 20-28
Gender Male, 99.5%
Calm 97.5%
Sad 0.8%
Fear 0.6%
Angry 0.3%
Happy 0.3%
Surprised 0.2%
Confused 0.2%
Disgusted 0.2%

AWS Rekognition

Age 42-50
Gender Male, 99.9%
Calm 46%
Angry 10%
Sad 8.9%
Happy 8.4%
Fear 7.8%
Surprised 7.8%
Disgusted 7%
Confused 4.2%
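
The age, gender, and emotion estimates above are the standard output of Amazon Rekognition face detection when full attributes are requested. A minimal sketch; the image file name is a placeholder.

```python
# Minimal sketch of the call that yields age/gender/emotion estimates like
# those above, using Amazon Rekognition's detect_faces with full attributes.
import boto3

rekognition = boto3.client("rekognition", region_name="us-east-1")

with open("fallen_horse_winter.jpg", "rb") as f:  # hypothetical local copy
    response = rekognition.detect_faces(
        Image={"Bytes": f.read()},
        Attributes=["ALL"],
    )

for face in response["FaceDetails"]:
    age = face["AgeRange"]
    gender = face["Gender"]
    print(f"Age {age['Low']}-{age['High']}")
    print(f"Gender {gender['Value']}, {gender['Confidence']:.1f}%")
    # Emotions come back with confidences; sort to match the listing above.
    for emotion in sorted(face["Emotions"], key=lambda e: -e["Confidence"]):
        print(f"{emotion['Type'].capitalize()} {emotion['Confidence']:.1f}%")
```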

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Possible
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely
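
The likelihood ratings above mirror Google Cloud Vision face detection, which reports each attribute as a likelihood enum. A minimal sketch using the google-cloud-vision client; the file name is a placeholder and credentials are assumed to come from the environment.

```python
# Minimal sketch of reproducing the likelihood ratings above with Google
# Cloud Vision face detection. The image file name is a placeholder.
from google.cloud import vision

client = vision.ImageAnnotatorClient()

with open("fallen_horse_winter.jpg", "rb") as f:  # hypothetical local copy
    image = vision.Image(content=f.read())

response = client.face_detection(image=image)

for face in response.face_annotations:
    # Each field is a Likelihood enum such as VERY_UNLIKELY or POSSIBLE.
    print("Surprise", vision.Likelihood(face.surprise_likelihood).name)
    print("Anger", vision.Likelihood(face.anger_likelihood).name)
    print("Sorrow", vision.Likelihood(face.sorrow_likelihood).name)
    print("Joy", vision.Likelihood(face.joy_likelihood).name)
    print("Headwear", vision.Likelihood(face.headwear_likelihood).name)
    print("Blurred", vision.Likelihood(face.blurred_likelihood).name)
```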

Feature analysis

Amazon

Person 99.2%
Truck 98.1%
Car 88.9%

Text analysis

Amazon

OOF'S
LIQU
the LIQU
the

Google

RODF'S
RODF'S
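
The OCR fragments above could be produced by Amazon Rekognition text detection and Google Vision text detection. A minimal sketch of both calls; the image file name is a placeholder.

```python
# Minimal sketch of the OCR calls that could produce fragments like the ones
# above: Amazon Rekognition detect_text and Google Vision text_detection.
import boto3
from google.cloud import vision

with open("fallen_horse_winter.jpg", "rb") as f:  # hypothetical local copy
    image_bytes = f.read()

# Amazon Rekognition OCR: word- and line-level detections with confidences.
rekognition = boto3.client("rekognition", region_name="us-east-1")
aws_response = rekognition.detect_text(Image={"Bytes": image_bytes})
for detection in aws_response["TextDetections"]:
    if detection["Type"] == "WORD":
        print("Amazon:", detection["DetectedText"])

# Google Vision OCR: the first annotation is the full text, the rest are words.
client = vision.ImageAnnotatorClient()
gcp_response = client.text_detection(image=vision.Image(content=image_bytes))
for annotation in gcp_response.text_annotations[1:]:
    print("Google:", annotation.description)
```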