Human Generated Data

Title

Untitled (men in hats and ties standing in front of outdoor machinery)

Date

c. 1940

People

Artist: John Deusing, American, active 1940s

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.1783

Machine Generated Data

Tags

Amazon
created on 2021-12-14

Person 99.8
Human 99.8
Person 99.2
Person 98.6
Person 88.3
Nature 87.9
Transportation 84.8
Vehicle 84.1
Outdoors 74.4
People 57.7

Clarifai
created on 2023-10-15

vehicle 99.8
people 99.4
car 98.4
transportation system 96.1
adult 95.9
man 95.3
campsite 93.9
monochrome 93.9
vintage 92.6
group 92.1
tree 91.8
driver 91.1
child 88.9
home 87.5
road 86.7
group together 86.4
police 85.7
snow 84.2
retro 83.3
woman 83.2

Imagga
created on 2021-12-14

picket fence 30.5
landscape 28.3
fence 27.6
structure 24.2
sky 23.7
tree 21.3
forest 20.9
negative 20.1
winter 19.6
barrier 19.1
rural 18.5
wheeled vehicle 17.6
snow 17.6
vehicle 17.6
film 16.2
old 16
trees 16
outdoor 15.3
clouds 15.2
field 15.1
park 14.8
container 14.1
country 14
cold 13.8
scenery 13.5
season 13.3
obstruction 12.8
countryside 12.8
dark 12.5
mobile home 12.4
scene 12.1
sunset 11.7
environment 11.5
ashcan 11.4
autumn 11.4
scenic 11.4
cloud 11.2
grunge 11.1
photographic paper 11.1
road 10.8
vintage 10.8
river 10.7
fog 10.6
weather 10.5
ice 10.3
grain 10.2
car 10.1
bin 10.1
natural 10
trailer 10
morning 9.9
travel 9.9
housing 9.8
sun 9.8
summer 9.6
antique 9.5
grass 9.5
water 9.3
evening 9.3
dirty 9
black 9
empty 8.6
outside 8.6
wood 8.3
sand 8.3
brown 8.1
transportation 8.1
farm 8
light 8
night 8
misty 7.9
beach 7.8
space 7.8
mist 7.7
frozen 7.6
old fashioned 7.6
house 7.5
hill 7.5
building 7.4
photographic equipment 7.4
branch 7.3
holiday 7.2
mountain 7.1
motor vehicle 7.1

Microsoft
created on 2021-12-14

outdoor 99.9
tree 99.6
text 99.3
white 77.2
transport 69.5
old 67.9
vintage 29.7

Face analysis

AWS Rekognition

Age 29-45
Gender Male, 59.6%
Calm 99%
Happy 0.5%
Surprised 0.1%
Sad 0.1%
Angry 0.1%
Disgusted 0.1%
Confused 0.1%
Fear 0%

AWS Rekognition

Age 17-29
Gender Male, 69.1%
Calm 99%
Happy 0.7%
Surprised 0.1%
Angry 0.1%
Sad 0.1%
Disgusted 0%
Confused 0%
Fear 0%

AWS Rekognition

Age 26-42
Gender Male, 96.5%
Calm 86.4%
Disgusted 4.7%
Sad 3.8%
Confused 1.4%
Surprised 1.2%
Happy 1%
Fear 0.9%
Angry 0.5%

AWS Rekognition

Age 36-54
Gender Male, 65.7%
Calm 99.2%
Happy 0.5%
Sad 0.1%
Surprised 0.1%
Angry 0.1%
Confused 0.1%
Disgusted 0%
Fear 0%

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Unlikely
Blurred Unlikely

Feature analysis

Amazon

Person 99.8%

Categories

Imagga

paintings art 98.2%

Text analysis

Amazon

CUCEA
EVERY