Human Generated Data

Title

Untitled (three women sitting in front of trailer)

Date

1952

People

Artist: Joseph Janney Steinmetz, American, 1905–1985

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.8704

Copyright

© Estate of Joseph Janney Steinmetz

Machine Generated Data

Tags

Amazon
created on 2022-01-09

Person 99.8
Human 99.8
Person 98.3
Person 95.3
Nature 94.3
Outdoors 91.5
Chair 88.9
Furniture 88.9
Tent 88.1
Yard 82.6
Meal 76.3
Food 76.3
People 71.5
Shelter 70.3
Countryside 70.3
Building 70.3
Rural 70.3
Clothing 66.2
Apparel 66.2
Face 65.9
Patio 57.1
Restaurant 55.6
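
Label/confidence pairs in this form are what Amazon Rekognition's DetectLabels API returns. A minimal sketch of such a call via boto3, assuming configured AWS credentials and a hypothetical local copy of the image:

```python
import boto3

client = boto3.client("rekognition")

with open("photo.jpg", "rb") as f:  # hypothetical local copy of the image
    response = client.detect_labels(
        Image={"Bytes": f.read()},
        MinConfidence=55,  # illustrative floor; the lowest score above is 55.6
    )

# Print each label with its confidence, matching the list format above.
for label in response["Labels"]:
    print(f"{label['Name']} {label['Confidence']:.1f}")
```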

Clarifai
created on 2023-10-25

tent 99.6
people 99.2
group 94.5
furniture 94.3
man 92.5
campsite 91.7
chair 91.2
monochrome 91
group together 88
adult 87.4
camp 86.8
family 86.4
war 86.3
no person 83.4
street 82.8
table 80.5
house 80
home 79.8
music 79.8
sunblind 79
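
Concept scores like these come from Clarifai's v2 predict endpoint. A hedged sketch using the public general-recognition model, where the API key, model ID, and filename are placeholder assumptions:

```python
import base64
import requests

with open("photo.jpg", "rb") as f:  # hypothetical local copy of the image
    image_b64 = base64.b64encode(f.read()).decode()

# Placeholder key; "general-image-recognition" is Clarifai's public model ID.
response = requests.post(
    "https://api.clarifai.com/v2/models/general-image-recognition/outputs",
    headers={"Authorization": "Key YOUR_CLARIFAI_KEY"},
    json={"inputs": [{"data": {"image": {"base64": image_b64}}}]},
)

# Concept values are 0-1; scale to percentages as shown in the list above.
for concept in response.json()["outputs"][0]["data"]["concepts"]:
    print(f"{concept['name']} {concept['value'] * 100:.1f}")
```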

Imagga
created on 2022-01-09

stall 25
building 20.9
city 16.6
structure 16.1
sky 15.9
architecture 15.8
night 15.1
chair 14.1
travel 14.1
urban 14
sunset 13.5
silhouette 13.2
musical instrument 13.1
percussion instrument 12.7
landscape 12.6
dark 11.7
business 11.5
light 11.4
shopping cart 11.3
modern 11.2
transport 11
wheeled vehicle 10.9
road 10.8
sun 10.5
street 10.1
island 10.1
restaurant 10.1
house 10
water 10
transportation 9.9
billboard 9.5
man 9.4
industry 9.4
sea 9.4
evening 9.3
piano 9.3
window 9.3
stage 9.1
people 8.9
container 8.7
furniture 8.7
platform 8.6
dusk 8.6
sunrise 8.4
beach 8.4
smoke 8.4
handcart 8.3
industrial 8.2
tower 8
keyboard instrument 8
equipment 8
steel 8
stringed instrument 7.7
outdoor 7.6
device 7.5
destination 7.5
ocean 7.5
tourism 7.4
vacation 7.4
landmark 7.2
seat 7.2
grand piano 7.1
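
Tags with English names and confidence scores in this shape are returned by Imagga's /v2/tags endpoint. A minimal sketch assuming an API key/secret pair for HTTP basic auth and a hypothetical local image file:

```python
import requests

with open("photo.jpg", "rb") as f:  # hypothetical local copy of the image
    response = requests.post(
        "https://api.imagga.com/v2/tags",
        auth=("YOUR_API_KEY", "YOUR_API_SECRET"),  # placeholder credentials
        files={"image": f},
    )

# Each result carries a localized tag name and a confidence score.
for tag in response.json()["result"]["tags"]:
    print(f"{tag['tag']['en']} {tag['confidence']:.1f}")
```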

Google
created on 2022-01-09

Microsoft
created on 2022-01-09

text 99.5
black and white 96.7
furniture 94.4
house 90.1
outdoor 88.1
chair 86.9
table 85.9
monochrome 79.1
white 69.8
black 67.7
building 54.4
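
Tags in this form match the output of Azure's Computer Vision service. A minimal sketch with the azure-cognitiveservices-vision-computervision SDK, where the endpoint, key, and filename are placeholder assumptions:

```python
from azure.cognitiveservices.vision.computervision import ComputerVisionClient
from msrest.authentication import CognitiveServicesCredentials

# Placeholder endpoint and key for the Computer Vision resource.
client = ComputerVisionClient(
    "https://YOUR_RESOURCE.cognitiveservices.azure.com/",
    CognitiveServicesCredentials("YOUR_AZURE_KEY"),
)

with open("photo.jpg", "rb") as f:  # hypothetical local copy of the image
    result = client.tag_image_in_stream(f)

# Confidences are 0-1; scale to percentages as shown in the list above.
for tag in result.tags:
    print(f"{tag.name} {tag.confidence * 100:.1f}")
```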

Face analysis

AWS Rekognition

Age 16-24
Gender Female, 54.4%
Calm 58.8%
Happy 30.3%
Sad 7.4%
Surprised 1.2%
Fear 0.7%
Confused 0.6%
Angry 0.5%
Disgusted 0.5%
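
Age range, gender, and emotion scores in this shape come from Amazon Rekognition's DetectFaces API with all facial attributes requested. A minimal boto3 sketch under the same placeholder assumptions as above:

```python
import boto3

client = boto3.client("rekognition")

with open("photo.jpg", "rb") as f:  # hypothetical local copy of the image
    response = client.detect_faces(
        Image={"Bytes": f.read()},
        Attributes=["ALL"],  # request age, gender, and emotion estimates
    )

for face in response["FaceDetails"]:
    age = face["AgeRange"]
    print(f"Age {age['Low']}-{age['High']}")
    print(f"Gender {face['Gender']['Value']}, {face['Gender']['Confidence']:.1f}%")
    # Emotion types arrive uppercase (e.g. CALM); title-case to match above.
    for emotion in face["Emotions"]:
        print(f"{emotion['Type'].title()} {emotion['Confidence']:.1f}%")
```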

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely
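
The six likelihood fields repeat once per detected face (three sitters here), matching Google Cloud Vision's face detection response. A minimal sketch with the google-cloud-vision client, assuming configured credentials and a hypothetical local image file:

```python
from google.cloud import vision

client = vision.ImageAnnotatorClient()

with open("photo.jpg", "rb") as f:  # hypothetical local copy of the image
    image = vision.Image(content=f.read())

response = client.face_detection(image=image)

for face in response.face_annotations:
    # Each likelihood is an enum value such as VERY_UNLIKELY or LIKELY.
    print("Surprise", vision.Likelihood(face.surprise_likelihood).name)
    print("Anger", vision.Likelihood(face.anger_likelihood).name)
    print("Sorrow", vision.Likelihood(face.sorrow_likelihood).name)
    print("Joy", vision.Likelihood(face.joy_likelihood).name)
    print("Headwear", vision.Likelihood(face.headwear_likelihood).name)
    print("Blurred", vision.Likelihood(face.blurred_likelihood).name)
```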

Feature analysis

Amazon

Person 99.8%
Tent 88.1%

Text analysis

Amazon

EL
39853
VAROY

Google

39853
39853
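
Amazon's entries come from Rekognition's DetectText API, which reports both line- and word-level detections; Google's OCR likewise returns the full text block first and then individual words, which is why 39853 appears twice above. A minimal DetectText sketch under the same placeholder assumptions as the earlier blocks:

```python
import boto3

client = boto3.client("rekognition")

with open("photo.jpg", "rb") as f:  # hypothetical local copy of the image
    response = client.detect_text(Image={"Bytes": f.read()})

# Type is LINE for grouped text and WORD for individual tokens.
for detection in response["TextDetections"]:
    print(detection["DetectedText"], detection["Type"])
```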