Human Generated Data

Title

Untitled (elephant being loaded onto truck)

Date

1956

People

Artist: Joseph Janney Steinmetz, American, 1905 - 1985

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.7903

Copyright

© Estate of Joseph Janney Steinmetz

Machine Generated Data

Tags

Amazon
created on 2022-01-09

Human 99.7
Person 99.7
Person 99.5
Person 99.1
Person 99
Person 98
Wheel 97.7
Machine 97.7
Transportation 97.6
Truck 97.6
Vehicle 97.6
Elephant 96.6
Animal 96.6
Mammal 96.6
Wildlife 96.6
Nature 95
Wheel 90.1
Outdoors 89.8
Person 82.5
Snow 76.9
Automobile 74.2
Car 74.2
Elephant 61.5
Road 60.9
Winter 55.9
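
A minimal sketch of how a label list like the one above could be produced with Amazon Rekognition's DetectLabels API. It assumes the boto3 SDK, configured AWS credentials, and a local copy of the photograph; the file name and MinConfidence threshold are illustrative, not the values used for this record.

import boto3

rekognition = boto3.client("rekognition")

# Illustrative file name; any local JPEG or PNG of the photograph would do.
with open("steinmetz_elephant_truck.jpg", "rb") as f:
    image_bytes = f.read()

response = rekognition.detect_labels(
    Image={"Bytes": image_bytes},
    MaxLabels=50,
    MinConfidence=55,  # assumed threshold; the lowest score listed above is 55.9
)

# Each label carries a name and a confidence score, as in the list above.
for label in response["Labels"]:
    print(f"{label['Name']} {label['Confidence']:.1f}")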

Clarifai
created on 2023-10-25

people 99.4
dog 98.3
canine 97.8
cavalry 97.4
vehicle 96.1
transportation system 95.6
group 95
monochrome 94.3
adult 93.4
man 92.8
mammal 92.6
five 92.3
woman 91.8
child 91.7
group together 90
two 89.8
one 88.6
family 88.1
three 88
little 87.1

Imagga
created on 2022-01-09

truck 44.1
conveyance 39.6
vehicle 30
motor vehicle 25.6
transportation 21.5
wheeled vehicle 21.3
transport 19.2
trailer 18.8
garbage truck 18.3
grass 18.2
car 16.6
sky 16.6
road 16.3
rural 15.9
farm 15.2
field 15.1
travel 14.1
landscape 13.4
van 13
outside 12.8
industrial 11.8
tourism 11.5
auto 11.5
country 11.4
outdoors 11.3
camper 11.2
industry 11.1
summer 10.9
mobile home 10.8
vacation 10.6
tractor 10.5
old 9.8
structure 9.5
housing 9.5
wheel 9.4
work 9.4
dog 9.4
moving van 9.4
clouds 9.3
outdoor 9.2
recreational vehicle 9.1
lorry 9
machine 8.8
bus 8.8
scenic 8.8
man 8.7
cargo 8.7
automobile 8.6
architecture 8.6
drive 8.5
horse 8.2
water 8
agriculture 7.9
holiday 7.9
container 7.8
accident 7.8
male 7.8
adult 7.8
tree 7.7
move 7.7
person 7.4
safety 7.4
island 7.3
sunset 7.2
sea 7

Google
created on 2022-01-09

Microsoft
created on 2022-01-09

text 99.8
vehicle 85.7
outdoor 85.1
truck 85.1
firefighter 75.7
land vehicle 69.6
posing 35.2

Color Analysis

Face analysis

AWS Rekognition

Age 36-44
Gender Male, 98.4%
Calm 64.3%
Sad 22%
Angry 4.3%
Happy 3.4%
Disgusted 2.9%
Confused 1.5%
Surprised 1%
Fear 0.7%
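
A minimal sketch of how per-face age, gender, and emotion estimates like those above could be requested with Amazon Rekognition's DetectFaces API, assuming boto3 and a local copy of the image; the file name is illustrative.

import boto3

rekognition = boto3.client("rekognition")

with open("steinmetz_elephant_truck.jpg", "rb") as f:  # illustrative file name
    image_bytes = f.read()

response = rekognition.detect_faces(
    Image={"Bytes": image_bytes},
    Attributes=["ALL"],  # the default attribute set omits age, gender and emotions
)

for face in response["FaceDetails"]:
    age = face["AgeRange"]
    gender = face["Gender"]
    print(f"Age {age['Low']}-{age['High']}")
    print(f"Gender {gender['Value']}, {gender['Confidence']:.1f}%")
    # Emotions come back as uppercase types (CALM, SAD, ...) with confidences.
    for emotion in sorted(face["Emotions"], key=lambda e: e["Confidence"], reverse=True):
        print(f"{emotion['Type'].capitalize()} {emotion['Confidence']:.1f}%")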

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Possible
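
A minimal sketch of how face likelihoods like those above (Surprise, Anger, Sorrow, Joy, Headwear, Blurred) could be obtained from the Google Cloud Vision API, assuming the google-cloud-vision client library (v2 or later) and application credentials; the file name is illustrative.

from google.cloud import vision

client = vision.ImageAnnotatorClient()

with open("steinmetz_elephant_truck.jpg", "rb") as f:  # illustrative file name
    content = f.read()

response = client.face_detection(image=vision.Image(content=content))

# One FaceAnnotation per detected face; likelihood fields are enum values such as
# VERY_UNLIKELY, UNLIKELY, POSSIBLE, LIKELY, VERY_LIKELY.
for face in response.face_annotations:
    print("Surprise", face.surprise_likelihood.name)
    print("Anger", face.anger_likelihood.name)
    print("Sorrow", face.sorrow_likelihood.name)
    print("Joy", face.joy_likelihood.name)
    print("Headwear", face.headwear_likelihood.name)
    print("Blurred", face.blurred_likelihood.name)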

Feature analysis

Amazon

Person 99.7%
Wheel 97.7%
Truck 97.6%
Elephant 96.6%
Car 74.2%

Text analysis

Amazon

CIRCUS
it's
it's CIRCUS time
It's CIRCUS TIME
TIME
It's
time
-
Bros.
01945.
THE
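
A minimal sketch of how OCR lines such as "it's CIRCUS time" could be read from the photograph with Amazon Rekognition's DetectText API, again assuming boto3 and a local copy of the image with an illustrative file name.

import boto3

rekognition = boto3.client("rekognition")

with open("steinmetz_elephant_truck.jpg", "rb") as f:  # illustrative file name
    image_bytes = f.read()

response = rekognition.detect_text(Image={"Bytes": image_bytes})

# DetectText returns both LINE and WORD detections; printing only lines keeps
# the output close to the list above.
for detection in response["TextDetections"]:
    if detection["Type"] == "LINE":
        print(detection["DetectedText"], f"{detection['Confidence']:.1f}%")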

Google

S
its
times
CIRCUS
S ORCUS its CIRCUS times CIRCUS
ORCUS