Human Generated Data

Title

Untitled (costumed couple walking under the raised swords of guards)

Date

1947

People

Artist: Joseph Janney Steinmetz, American, 1905-1985

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.5508

Copyright

© Estate of Joseph Janney Steinmetz

Machine Generated Data

Tags

Amazon
created on 2022-01-23

Water 99.9
Outdoors 99.7
Human 99.2
Fishing 99.2
Person 98.9
Person 98.9
Person 98.8
Person 98.7
Person 97.5
Leisure Activities 95.1
Angler 95.1
Nature 71.3
Person 69.5
Person 59.9

Imagga
created on 2022-01-23

silhouette 38.9
people 37.4
crowd 35.5
team 34
teamwork 32.4
male 31.2
man 27.6
businessman 27.4
person 26.2
business 24.9
audience 23.4
group 23.4
job 22.1
businesswoman 21.8
symbol 21.5
nighttime 21.5
outfit 21.4
stadium 21.4
leader 21.2
work 21.2
occupation 21.1
vibrant 21
flag 20.7
president 20.6
men 20.6
cheering 20.6
design 20.2
patriotic 20.1
nation 19.9
supporters 19.8
speech 19.6
presentation 19.5
lights 19.5
boss 19.1
vivid 18.6
icon 18.2
bright 17.9
sexy 17.7
boutique 13.6
couple 12.2
black 10.8
wind instrument 10.7
dance 10.5
human 10.5
friendship 10.3
women 10.3
musical instrument 9.6
professional 9.5
cross 9.4
document 9.3
art 9.2
negative 9.2
adult 9.2
suit 9.1
meeting 8.5
sport 8.2
picket fence 8
boy 7.8
fashion 7.5
fun 7.5
painting 7.2
family 7.1
worker 7.1
together 7

Google
created on 2022-01-23

Microsoft
created on 2022-01-23

text 97.5
posing 83.1

Face analysis

AWS Rekognition

Age 45-51
Gender Male, 84.5%
Happy 98.8%
Sad 0.6%
Confused 0.1%
Angry 0.1%
Calm 0.1%
Surprised 0.1%
Disgusted 0.1%
Fear 0%

AWS Rekognition

Age 25-35
Gender Female, 72.3%
Calm 70.6%
Sad 19.2%
Confused 7.7%
Angry 0.8%
Surprised 0.5%
Happy 0.5%
Disgusted 0.3%
Fear 0.3%

AWS Rekognition

Age 29-39
Gender Male, 98.5%
Sad 69.4%
Calm 9.6%
Happy 4.4%
Surprised 4.1%
Fear 4.1%
Confused 3.8%
Angry 2.6%
Disgusted 2.1%

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person 98.9%

Captions

Microsoft

a group of people posing for a photo 82.9%
a group of people posing for a picture 82.8%
a group of people posing for the camera 82.7%

Text analysis

Amazon

22643
022MA
K.M.I.

Google

22643
22643