Human Generated Data

Title

Untitled (men, women and cars near a brick house)

Date

c. 1940

People

Artist: Joseph Janney Steinmetz, American, 1905–1985

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.8293

Copyright

© Estate of Joseph Janney Steinmetz

Machine Generated Data

Tags

Amazon
created on 2022-01-08

Person 99.1
Human 99.1
Clothing 99.1
Apparel 99.1
Person 99
Person 98.9
Person 95.3
Car 90.5
Vehicle 90.5
Automobile 90.5
Transportation 90.5
Car 88.5
Nature 82.6
Wheel 81.7
Machine 81.7
Wheel 80.4
Car 78.2
Overcoat 77.2
Coat 77.2
Suit 77.2
Car 76.9
Outdoors 76.2
People 73.4
Person 72
Housing 67.2
Building 67.2
Robe 63.7
Fashion 63.7
Plant 62.1
Face 60.2
Gown 59
Grass 58.5
House 57.6
Person 57.3
Wedding 55.9
Countryside 55.8
Wedding Gown 55.4

Clarifai
created on 2023-10-25

people 99.9
many 98.7
group together 98.6
group 98
adult 97.5
man 96.3
vehicle 95.9
war 93.8
military 91.5
woman 89.5
several 89.5
leader 87.9
home 86.6
soldier 85.9
child 84.8
uniform 84.6
transportation system 84.4
administration 84.3
street 82.4
monochrome 81.7

Imagga
created on 2022-01-08

sky 25.5
picket fence 20.7
dairy 20
fence 18.6
cemetery 18
landscape 17.9
tree 16.2
structure 15.9
scene 15.6
billboard 14.9
park 14.5
travel 14.1
barrier 13.8
outdoor 13.8
road 13.6
bench 12.6
field 12.5
trees 12.5
rural 12.3
park bench 12.3
signboard 12.1
cloud 12.1
winter 11.9
snow 11.9
old 11.8
transportation 11.7
environment 11.5
clouds 11
vehicle 10.6
sand 10.5
summer 10.3
smoke 10.2
street 10.1
water 10
danger 10
industrial 10
steam 9.7
country 9.7
fog 9.7
forest 9.6
industry 9.4
architecture 9.4
seat 9.3
wheeled vehicle 9.2
black 9
vacation 9
sunset 9
river 8.9
destruction 8.8
grass 8.7
cold 8.6
season 8.6
obstruction 8.6
speed 8.2
dirty 8.1
horizon 8.1
night 8
scenic 7.9
disaster 7.8
wood 7.5
city 7.5
countryside 7.3
transport 7.3
protection 7.3
history 7.2
farm 7.1
day 7.1
sea 7

Google
created on 2022-01-08

Microsoft
created on 2022-01-08

outdoor 99.4
text 99.3
tree 98
road 96.4
person 76.4
clothing 70.6
car 55.5
old 48.7

Face analysis

AWS Rekognition

Age 30-40
Gender Female, 77.7%
Sad 69%
Calm 14.7%
Confused 10.6%
Happy 2%
Disgusted 1.4%
Fear 1.1%
Surprised 0.8%
Angry 0.5%

Google Vision (face 1)

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very likely

Google Vision (face 2)

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision (face 3)

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Likely

Google Vision (face 4)

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person 99.1%
Car 90.5%
Wheel 81.7%

Text analysis

Amazon

9695
9695.
31051
АЗДА
MUZ АЗДА
MUZ

Google

9695.
A70A
9695 9695. 9695. FA2 A70A
9695
FA2