Human Generated Data

Title

Untitled (people gathered around and sitting on top of a car, Princeton University reunion, Princeton, NJ)

Date

1937

People

Artist: Joseph Janney Steinmetz, American 1905 - 1985

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.7446

Copyright

© Estate of Joseph Janney Steinmetz

Machine Generated Data

Tags

Amazon
created on 2022-01-08

Person 99.8
Human 99.8
Person 99.6
Person 99.6
Person 99.3
Person 99.1
Person 98.8
Car 98.3
Automobile 98.3
Vehicle 98.3
Transportation 98.3
Person 97.1
Person 95.7
Person 93.8
Person 90.5
Person 85.4
Wheel 80.3
Machine 80.3
Clothing 80.1
Apparel 80.1
Spoke 77.9
People 77.8
Person 76.7
Shorts 66
Clinic 59.4
Overcoat 55.1
Coat 55.1
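
These label/confidence pairs match the output format of the Amazon Rekognition DetectLabels API. As a minimal sketch (assuming the scan is available locally under the hypothetical file name "steinmetz_reunion.jpg"; the exact pipeline behind this record is not documented here), similar tags could be produced with boto3:

    import boto3

    rekognition = boto3.client("rekognition")
    with open("steinmetz_reunion.jpg", "rb") as f:
        response = rekognition.detect_labels(
            Image={"Bytes": f.read()},
            MinConfidence=55,  # the lowest score listed above is 55.1
        )
    for label in response["Labels"]:
        # prints label name and confidence, e.g. "Person 99.8"
        print(label["Name"], round(label["Confidence"], 1))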

Clarifai
created on 2023-10-25

people 99
monochrome 99
many 95.2
transportation system 92.8
vehicle 92.8
man 91.6
adult 87.8
group 86.1
group together 84.5
woman 81.3
war 77.3
container 75.8
black and white 75.8
administration 75.6
military 75.4
street 73.6
car 71.6
retro 71.3
nostalgia 70.6
box 69.2

Imagga
created on 2022-01-08

graffito 37.6
decoration 28
vehicle 25.4
ashcan 21.2
old 18.1
grunge 17.9
bin 17.8
container 17.1
structure 15.1
travel 14.1
wheeled vehicle 13
landscape 12.6
car 12.6
snow 12.6
vintage 12.4
water 12
dirty 11.7
transportation 11.7
art 11.3
scene 11.2
stone 11.1
industrial 10.9
sky 10.8
black 10.8
city 10.8
summer 10.3
house 10
texture 9.7
scenic 9.7
winter 9.4
holiday 9.3
military vehicle 9.3
power 9.2
transport 9.1
billboard 8.8
urban 8.7
antique 8.7
track 8.6
architecture 8.6
tree 8.5
gravestone 8.5
outdoor 8.4
tracked vehicle 8.3
tourism 8.2
freight car 8.2
retro 8.2
vacation 8.2
aged 8.1
road 8.1
sea 7.8
industry 7.7
clouds 7.6
horizontal 7.5
frame 7.5
ocean 7.5
silhouette 7.4
cemetery 7.4
building 7.4
speed 7.3
design 7.3
business 7.3
danger 7.3
paint 7.2
scenery 7.2
machine 7.2
coast 7.2
memorial 7.2
signboard 7.1
rural 7
season 7

Microsoft
created on 2022-01-08

text 96.2
vehicle 67.1
black and white 57.4
posing 41.4
several 11.1

Face analysis

AWS Rekognition

Age 30-40
Gender Male, 60.3%
Calm 70.1%
Happy 20.3%
Sad 6.6%
Angry 0.7%
Fear 0.7%
Confused 0.6%
Disgusted 0.6%
Surprised 0.5%

AWS Rekognition

Age 26-36
Gender Female, 74.5%
Happy 32.8%
Calm 28.9%
Sad 24.3%
Angry 4.3%
Confused 3.5%
Disgusted 2.2%
Fear 2.2%
Surprised 1.7%

AWS Rekognition

Age 22-30
Gender Male, 59%
Calm 43.8%
Happy 20.8%
Disgusted 18.2%
Sad 5.5%
Surprised 4.4%
Fear 3.5%
Angry 3.1%
Confused 0.7%

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely
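
The age ranges, gender guesses, and emotion scores above follow the shape of the AWS Rekognition DetectFaces response. A hedged sketch of how such per-face estimates could be read back (again assuming the hypothetical local file name; age, gender, and emotion attributes are only returned when requested):

    import boto3

    rekognition = boto3.client("rekognition")
    with open("steinmetz_reunion.jpg", "rb") as f:
        response = rekognition.detect_faces(
            Image={"Bytes": f.read()},
            Attributes=["ALL"],  # required for age, gender, and emotions
        )
    for face in response["FaceDetails"]:
        age = face["AgeRange"]
        print(f"Age {age['Low']}-{age['High']}")
        print(f"Gender {face['Gender']['Value']}, {face['Gender']['Confidence']:.1f}%")
        # emotions sorted from most to least confident, e.g. "Calm 70.1%"
        for emotion in sorted(face["Emotions"], key=lambda e: -e["Confidence"]):
            print(f"{emotion['Type'].capitalize()} {emotion['Confidence']:.1f}%")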

Feature analysis

Amazon

Person 99.8%
Car 98.3%
Wheel 80.3%

Categories

Imagga

paintings art 99.9%

Text analysis

Amazon

EN
29
MAY
GOOD
MY GOOD
MY
MAY U
U
33
HOLM

Google

MAY U EN MY GOOD 29 ल
MAY
MY
U
EN
GOOD
29
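
The detected strings above resemble word- and line-level OCR output from Amazon Rekognition DetectText and Google Cloud Vision text detection. A minimal Rekognition sketch, under the same hypothetical file-name assumption:

    import boto3

    rekognition = boto3.client("rekognition")
    with open("steinmetz_reunion.jpg", "rb") as f:
        response = rekognition.detect_text(Image={"Bytes": f.read()})
    for detection in response["TextDetections"]:
        if detection["Type"] == "WORD":  # LINE entries repeat the grouped words
            print(detection["DetectedText"], round(detection["Confidence"], 1))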