Human Generated Data

Title

Untitled (wedding group in a park)

Date

1970s

People

Artist: Joel Meyerowitz, American, born 1938

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, 2.2002.3356

Copyright

© Joel Meyerowitz

Machine Generated Data

Tags

Amazon
created on 2022-01-22

Clothing 99.7
Apparel 99.7
Shorts 99.7
Human 99.7
Person 99.7
Person 99.1
Person 99
Person 98.8
Tree 98.1
Plant 98.1
Person 96.5
Play 93.6
Grass 93.5
Person 93.3
Back 90.4
Vegetation 87.9
People 86.6
Fir 82.7
Abies 82.7
Person 74.2
Grove 72.1
Land 72.1
Woodland 72.1
Nature 72.1
Forest 72.1
Outdoors 72.1
Conifer 69.8
Sport 64.9
Sports 64.9
Child 60.8
Kid 60.8
Face 59.2
Team 56.5
Team Sport 56.5
Sphere 55.7

Imagga
created on 2022-01-22

swing 37.6
mechanical device 34.1
child 33.4
plaything 28.8
mechanism 25.4
sunset 22.5
park 21.4
outdoor 21.4
tree 20.9
people 20.6
silhouette 19.9
outdoors 18.1
sky 18
person 17.7
autumn 17.6
man 17.5
sport 17.4
forest 17.4
world 16.8
sun 16.1
grass 15.8
adult 15.6
male 14.6
fun 14.2
summer 14.2
landscape 14.1
boy 13.9
beach 13.5
trees 13.3
active 12.6
happy 12.5
portrait 12.3
light 12
black 12
water 12
love 11.8
field 11.7
lifestyle 11.6
couple 11.3
happiness 11
joy 10.9
free 10.3
dark 10
leisure 10
fall 10
recreation 9.9
kid 9.8
scene 9.5
play 9.5
walking 9.5
day 9.4
pedestrian 9.2
mother 9.2
peaceful 9.2
children 9.1
exercise 9.1
health 9
meadow 9
childhood 9
family 8.9
rural 8.8
scenic 8.8
woods 8.6
men 8.6
walk 8.6
outside 8.6
smile 8.6
two 8.5
travel 8.5
relax 8.4
old 8.4
ocean 8.3
parent 8.3
freedom 8.2
environment 8.2
countryside 8.2
girls 8.2
life 8.2
vacation 8.2
activity 8.1
natural 8
river 8
father 7.8
sea 7.8
season 7.8
space 7.8
dad 7.8
sunny 7.7
jumping 7.7
pretty 7.7
jump 7.7
kids 7.5
sunrise 7.5
evening 7.5
holding 7.4
road 7.2
scenery 7.2
athlete 7.1
women 7.1
spring 7.1

Google
created on 2022-01-22

Microsoft
created on 2022-01-22

tree 100
outdoor 99.9
sky 99.2
person 90.7
text 87.1
clothing 86.8
black and white 72.5
water 72
man 60.7

Face analysis

AWS Rekognition

Age 43-51
Gender Female, 96.3%
Happy 99.5%
Calm 0.3%
Surprised 0.1%
Sad 0%
Confused 0%
Angry 0%
Disgusted 0%
Fear 0%

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Possible
Headwear Unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person 99.7%

Captions

Microsoft

a group of people standing next to a tree 92.9%
a group of people that are standing in the grass 91.6%
a group of people standing in a field 91.5%