Human Generated Data

Title

Untitled (man and woman sitting in lawn chairs outside of trailer home)

Date

c. 1950

People

Artist: Joseph Janney Steinmetz, American, 1905–1985

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.7566

Copyright

© Estate of Joseph Janney Steinmetz

Machine Generated Data

Tags

Amazon
created on 2022-01-08

Nature 98.8%
Outdoors 98%
Person 97.6%
Human 97.6%
Person 96.8%
Person 96.8%
Rural 96.7%
Countryside 96.7%
Building 96.7%
Shelter 96.7%
Yard 91.6%
Plant 81.5%
Tree 81.5%
Furniture 70.7%
Road 67.2%
Hut 65.8%
Shack 65.8%
Ice 65.2%
People 64.4%
Fir 61.9%
Abies 61.9%
Urban 60.7%
Weather 59.2%
Housing 58.9%
Chair 58.7%
Table 57%
Wall 56.9%
Snow 56.4%

Imagga
created on 2022-01-08

shopping cart 66.1%
handcart 53.4%
wheeled vehicle 42.8%
negative 29.9%
park bench 26.5%
film 25.1%
container 25.1%
bench 24.9%
old 21.6%
seat 17.6%
fence 17.3%
structure 17.2%
landscape 17.1%
picket fence 16.6%
photographic paper 16.4%
grunge 16.2%
sky 15.9%
tree 15.8%
conveyance 14.3%
architecture 14.1%
park 14%
black 13.8%
winter 13.6%
building 13.5%
snow 12.7%
antique 12.1%
vintage 11.6%
trees 11.6%
outdoor 11.5%
forest 11.3%
photographic equipment 10.9%
house 10.9%
furniture 10.4%
texture 10.4%
barrier 10.3%
dark 10%
city 10%
travel 9.9%
summer 9.6%
pattern 9.6%
scene 9.5%
frame 9.2%
art 9.1%
scenery 9%
water 8.7%
cold 8.6%
holiday 8.6%
design 8.4%
wood 8.3%
retro 8.2%
light 8%
night 8%
rural 7.9%
day 7.8%
season 7.8%
sepia 7.8%
empty 7.7%
silhouette 7.4%
tourism 7.4%
paint 7.2%
sun 7.2%
color 7.2%
sunset 7.2%
history 7.2%
paper 7.1%

Google
created on 2022-01-08

Microsoft
created on 2022-01-08

tree 99.3%
text 98.3%
black and white 90.7%
flower 82.3%
furniture 68%

Face analysis

AWS Rekognition

Age 19-27
Gender Female, 72.9%
Sad 51.5%
Calm 37.7%
Fear 2.5%
Angry 2.3%
Surprised 2.2%
Disgusted 1.6%
Confused 1.3%
Happy 0.9%

AWS Rekognition

Age 21-29
Gender Female, 88%
Calm 96%
Disgusted 1.3%
Happy 1.2%
Sad 0.4%
Angry 0.4%
Confused 0.3%
Fear 0.2%
Surprised 0.2%

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person 97.6%

Captions

Microsoft

an old photo of a person 74.6%
old photo of a person 70.2%
a group of people sitting around a fire 27.6%

Text analysis

Amazon

63
0 63
0
KODA-A-EITW

Google

MJI7--
YT3RA°2
MJI7-- YT3RA°2 --XAGO
--XAGO