Human Generated Data

Title

Untitled (couple walking on sidewalk, Florasota Gardens)

Date

1953

People

Artist: Joseph Janney Steinmetz, American, 1905 - 1985

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.8730

Copyright

© Estate of Joseph Janney Steinmetz

Machine Generated Data

Tags

Amazon
created on 2022-01-09

Person 99.6
Human 99.6
Person 99.6
Clothing 94.7
Apparel 94.7
Car 92.8
Transportation 92.8
Vehicle 92.8
Automobile 92.8
Shorts 90.1
Shoe 89.7
Footwear 89.7
People 82.5
Tree 78.6
Plant 78.6
Car 77.3
Female 74.2
Tarmac 73.5
Asphalt 73.5
Pedestrian 72.8
Road 70.2
Working Out 69.8
Exercise 69.8
Sport 69.8
Sports 69.8
Walking 67.8
Fitness 66.7
Suit 59.4
Coat 59.4
Overcoat 59.4
Woman 58.4
Face 58.3
Photography 58.2
Photo 58.2
Sleeve 55.1
Shoe 55
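
These labels follow the output format of AWS Rekognition's DetectLabels API: a label name paired with a confidence score. A minimal sketch of how such a list could be produced with boto3 is below; the image file name is a placeholder and configured AWS credentials are assumed.

    import boto3

    rekognition = boto3.client("rekognition")

    # Placeholder file name; any local JPEG of the photograph would do.
    with open("steinmetz_8730.jpg", "rb") as f:
        response = rekognition.detect_labels(
            Image={"Bytes": f.read()},
            MinConfidence=55,  # the lowest score in the list above is Shoe 55
        )

    # Print "Name Confidence" pairs in the same form as the tags above, e.g. "Person 99.6".
    for label in response["Labels"]:
        print(f"{label['Name']} {label['Confidence']:.1f}")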

Clarifai
created on 2023-10-25

people 99.9
two 98.3
adult 98.1
group together 97.8
man 97.4
woman 96.9
monochrome 91.7
recreation 91.2
child 91.1
sports equipment 90.8
wear 90.8
group 90.3
several 87.8
four 87.2
three 86.4
street 85.6
home 83.2
outfit 81
adolescent 80.6
leader 80

Imagga
created on 2022-01-09

swing 99.5
mechanical device 72.8
plaything 72.5
mechanism 54.2
crutch 33.4
staff 26.6
park 26.3
outdoors 24
stick 21.6
outdoor 20.6
child 19.9
people 17.3
day 17.3
happy 16.3
walking 16.1
summer 16.1
adult 15.5
tree 15.4
fun 15
sport 14.9
pedestrian 14.4
man 13.4
family 13.3
happiness 13.3
person 13
lifestyle 13
smiling 12.3
couple 12.2
male 12.2
play 12.1
street 12
recreation 11.7
portrait 11.6
childhood 11.6
city 11.6
smile 11.4
playground 10.8
active 10.8
vacation 10.6
pretty 10.5
sunny 10.3
outside 10.3
youth 10.2
resort area 10.1
playing 10
leisure 10
travel 9.9
activity 9.9
standing 9.6
walk 9.5
joy 9.2
children 9.1
landscape 8.9
kid 8.9
area 8.7
boy 8.7
senior 8.4
sun 8
women 7.9
grass 7.9
cute 7.9
wheelchair 7.7
sitting 7.7
attractive 7.7
old 7.7
building 7.3
cheerful 7.3
lady 7.3
teenager 7.3
exercise 7.3
road 7.2
love 7.1
equipment 7.1
sky 7
together 7

Google
created on 2022-01-09

Microsoft
created on 2022-01-09

text 97.6
outdoor 95.8
tree 91.3
clothing 90
person 88.8
footwear 80.5
man 79.4
black and white 65.1

Color Analysis

Face analysis

AWS Rekognition

Age 51-59
Gender Male, 99.9%
Calm 99.2%
Happy 0.4%
Confused 0.1%
Sad 0.1%
Surprised 0.1%
Angry 0.1%
Disgusted 0%
Fear 0%

AWS Rekognition

Age 38-46
Gender Male, 86.4%
Calm 92%
Sad 3.7%
Happy 3.1%
Confused 0.4%
Angry 0.3%
Disgusted 0.3%
Surprised 0.1%
Fear 0.1%
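
The two blocks above (an age range, a gender estimate, and an emotion breakdown per detected face) match the shape of AWS Rekognition's DetectFaces response. A minimal sketch, assuming boto3 and a placeholder file name:

    import boto3

    rekognition = boto3.client("rekognition")

    with open("steinmetz_8730.jpg", "rb") as f:  # placeholder file name
        response = rekognition.detect_faces(
            Image={"Bytes": f.read()},
            Attributes=["ALL"],  # required to return AgeRange, Gender, and Emotions
        )

    # One FaceDetails entry per detected face; the couple yields two entries.
    for face in response["FaceDetails"]:
        age = face["AgeRange"]
        print(f"Age {age['Low']}-{age['High']}")
        print(f"Gender {face['Gender']['Value']}, {face['Gender']['Confidence']:.1f}%")
        for emotion in face["Emotions"]:
            print(f"{emotion['Type'].title()} {emotion['Confidence']:.1f}%")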

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person 99.6%
Car 92.8%
Shoe 89.7%

Categories

Text analysis

Amazon

BED
38329.
YT3-X

Google

BED
BED
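
The strings above ("BED", "38329.", "YT3-X") are fragments of text detected within the photograph itself. A minimal sketch of the Amazon side, assuming AWS Rekognition's DetectText API via boto3 and a placeholder file name:

    import boto3

    rekognition = boto3.client("rekognition")

    with open("steinmetz_8730.jpg", "rb") as f:  # placeholder file name
        response = rekognition.detect_text(Image={"Bytes": f.read()})

    # Keep whole lines; Rekognition also returns individual WORD detections.
    for detection in response["TextDetections"]:
        if detection["Type"] == "LINE":
            print(detection["DetectedText"])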