Human Generated Data

Title

Untitled (man with fishing pole in front of small house)

Date

c. 1950

People

Artist: Joseph Janney Steinmetz, American, 1905 - 1985

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.7614

Copyright

© Estate of Joseph Janney Steinmetz

Machine Generated Data

Tags

Amazon
created on 2022-01-08

Person 99.6
Human 99.6
Housing 95
Building 95
Vegetation 82.7
Bush 82.7
Plant 82.7
Outdoors 81.6
Nature 78
Land 75.5
Meal 74.6
Food 74.6
Woodland 74.2
Grove 74.2
Forest 74.2
Tree 74.2
Villa 70.4
House 70.4
Kiosk 60.5
Train 60.4
Vehicle 60.4
Transportation 60.4
Mobile Home 59.5
Cottage 58.6
Urban 55.3
Person 46.7

Clarifai
created on 2023-10-25

people 99.6
group 98.6
adult 97.6
print 97.4
illustration 96.4
many 96.3
man 95.2
home 93.8
monochrome 93.5
art 93.3
street 92.1
war 92
vehicle 91.9
woman 91.1
cavalry 89.3
vintage 87.6
group together 86.9
administration 85.1
crowd 84.2
painting 82.9

Imagga
created on 2022-01-08

structure 34.4
wheeled vehicle 25.8
billboard 24.9
building 22.5
old 20.9
car 20.5
travel 20.4
landscape 20.1
signboard 19.3
city 19.1
architecture 18.5
vehicle 17.8
snow 16.9
house 16.9
balcony 16.2
tree 15.6
water 15.3
winter 15.3
freight car 14.6
grunge 14.5
trees 14.2
vintage 14.1
sky 14
conveyance 14
black 13.8
frame 12.5
tourism 12.4
antique 12.1
paint 11.8
border 11.8
pattern 11.6
retro 11.5
rural 11.4
cold 11.2
home 11.2
texture 11.1
grungy 10.4
art 10.4
mobile home 10.4
dirty 9.9
scenery 9.9
history 9.8
collage 9.6
television 9.6
damaged 9.5
scene 9.5
park 9.4
space 9.3
housing 9.2
weather 9.2
rough 9.1
screen 9.1
vacation 9
window 8.8
scenic 8.8
urban 8.7
wall 8.7
edge 8.7
passenger car 8.6
season 8.6
construction 8.6
weathered 8.5
clouds 8.4
outdoor 8.4
monitor 8.4
town 8.3
trailer 8.3
negative 8.2
material 8
roof 8
computer 8
river 8
car mirror 7.9
noisy 7.9
tramway 7.9
design 7.9
designed 7.9
layered 7.9
mess 7.8
photographic 7.8
forest 7.8
frames 7.8
noise 7.8
scratch 7.8
film 7.8
messy 7.7
layer 7.7
rust 7.7
mask 7.7
historic 7.3
graphic 7.3
digital 7.3
streetcar 7.3
landmark 7.2
transportation 7.2
tower 7.2
night 7.1
mirror 7
country 7
track 7

Google
created on 2022-01-08

White 92.2
Plant 90.6
Tree 82.5
Rectangle 80.6
Adaptation 79.4
Building 76.6
Facade 75.9
Monochrome 74.3
Room 70.6
Window 70.4
Monochrome photography 69.5
Landscape 69.4
History 68.5
Art 68
Font 64.5
Cottage 63.6
Stock photography 62.4
Paper product 61.6
Visual arts 60
Illustration 55.6

Microsoft
created on 2022-01-08

tree 100
text 98.8
house 86.9
black and white 84.5
white 62.8
old 61.9
plant 52.9

Color Analysis

Face analysis

Amazon

Google

AWS Rekognition

Age 23-31
Gender Male, 99.9%
Sad 64.3%
Angry 6.7%
Fear 6.6%
Calm 6%
Surprised 5.6%
Disgusted 4.6%
Confused 3.4%
Happy 2.9%

AWS Rekognition

Age 40-48
Gender Male, 57.4%
Calm 91.7%
Happy 6.4%
Sad 0.4%
Surprised 0.4%
Confused 0.4%
Fear 0.3%
Disgusted 0.2%
Angry 0.2%

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Possible

Feature analysis

Amazon

Person 99.6%
Train 60.4%

Categories

Captions

Microsoft
created on 2022-01-08

a vintage photo of a person 87.8%
an old photo of a person 87.1%
a vintage photo of a train 60.5%

Text analysis

Amazon

KODVK
39845-A
KODVK C.VEEIX
C.VEEIX

Google

3984 5-A
3984
5-A