Human Generated Data

Title

Untitled (girl doing gymnastics for audience)

Date

c. 1945

People

Artist: Joseph Janney Steinmetz, American, 1905-1985

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.7706

Copyright

© Estate of Joseph Janney Steinmetz

Machine Generated Data

Tags

Amazon
created on 2022-01-09

Person 97.5
Human 97.5
Person 96.9
Person 93.7
Indoors 93.5
Sunglasses 93
Accessories 93
Accessory 93
Interior Design 92.4
Person 91.2
Room 90.9
Person 89.4
Person 87.2
Person 86.5
Water 80.5
Flooring 78.1
Shorts 77.4
Clothing 77.4
Apparel 77.4
Floor 73.3
Person 69.8
Furniture 69.4
Fitness 68.4
Sport 68.4
Sports 68.4
Working Out 68.4
Exercise 68.4
Housing 67.1
Building 67.1
Chair 66.6
Person 64.9
Bedroom 64.3
Female 64
Person 63.5
Swimwear 61.5
Outdoors 61
Leisure Activities 59.3
Lobby 59.3
Back 56.3
Girl 55.3
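
The Amazon tags above are label/confidence pairs of the kind returned by AWS Rekognition's label-detection call. A minimal sketch of how comparable tags could be pulled with boto3 follows; the file name and thresholds are illustrative assumptions, not the pipeline used for this record.

# Minimal sketch: label detection with AWS Rekognition via boto3.
# The local file name and the MinConfidence threshold are assumptions.
import boto3

rekognition = boto3.client("rekognition")

with open("image.jpg", "rb") as f:
    image_bytes = f.read()

response = rekognition.detect_labels(
    Image={"Bytes": image_bytes},
    MaxLabels=50,
    MinConfidence=55,
)

for label in response["Labels"]:
    print(f'{label["Name"]} {label["Confidence"]:.1f}')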

Clarifai
created on 2023-10-25

people 99.8
man 96.6
adult 96.6
woman 96.1
one 94.1
indoors 94.1
monochrome 93.4
wear 92.6
many 92.3
group together 89.4
young 88.3
exercise 86.9
two 86.6
recreation 86.3
group 84.6
athlete 83.5
fun 80.6
swimming pool 79.1
sports equipment 78.5
child 77.6
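
The Clarifai concepts above (name plus probability) resemble the output of Clarifai's general image-recognition model. A rough sketch using the clarifai-grpc client; the personal access token, user/app identifiers, model ID, and image URL are all placeholders or assumptions, so treat this as an outline rather than a verified call.

# Rough sketch: concept prediction with the clarifai-grpc client.
# PAT, user/app IDs, model ID, and image URL below are assumptions.
from clarifai_grpc.channel.clarifai_channel import ClarifaiChannel
from clarifai_grpc.grpc.api import resources_pb2, service_pb2, service_pb2_grpc

stub = service_pb2_grpc.V2Stub(ClarifaiChannel.get_grpc_channel())
metadata = (("authorization", "Key YOUR_PAT"),)

request = service_pb2.PostModelOutputsRequest(
    user_app_id=resources_pb2.UserAppIDSet(user_id="clarifai", app_id="main"),
    model_id="general-image-recognition",
    inputs=[
        resources_pb2.Input(
            data=resources_pb2.Data(
                image=resources_pb2.Image(url="https://example.org/image.jpg")
            )
        )
    ],
)
response = stub.PostModelOutputs(request, metadata=metadata)

for concept in response.outputs[0].data.concepts:
    print(concept.name, round(concept.value * 100, 1))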

Imagga
created on 2022-01-09

people 24.5
person 23.5
man 22.2
adult 20.2
portrait 16.2
sexy 16.1
water 15.3
hair 15.1
fashion 14.3
city 14.1
model 14
urban 14
happy 13.8
men 13.7
male 13.7
human 12.7
attractive 12.6
standing 12.2
business 12.2
pretty 11.9
world 11.5
walk 11.4
black 10.9
lifestyle 10.8
wet 10.7
wall 10.6
lady 10.5
indoors 10.5
body 10.4
summer 10.3
clothing 10.2
skin 10.2
fountain 9.9
snow 9.8
businessman 9.7
cleaner 9.7
newspaper 9.6
women 9.5
walking 9.5
life 9.4
work 9.4
smiling 9.4
outdoors 9.4
smile 9.3
travel 9.2
garment 9.1
building 9.1
one 9
posing 8.9
covering 8.8
tourist 8.4
outdoor 8.4
health 8.3
vacation 8.2
sensuality 8.2
style 8.2
success 8.1
worker 8
swimsuit 7.7
youth 7.7
protective covering 7.6
casual 7.6
bath 7.6
relaxation 7.5
house 7.5
manager 7.5
heat 7.4
product 7.4
street 7.4
alone 7.3
cute 7.2
recreation 7.2
face 7.1
love 7.1
happiness 7.1
swimming trunks 7
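
The Imagga tags above look like output from Imagga's REST tagging endpoint. A hedged sketch with the requests library; the endpoint path, parameter names, credentials, and image URL are assumptions drawn from Imagga's documented v2 API, not details recorded with this object.

# Hedged sketch: tag retrieval from Imagga's v2 tagging endpoint.
# Endpoint, parameters, credentials, and URL are assumptions.
import requests

API_KEY = "your_api_key"        # placeholder
API_SECRET = "your_api_secret"  # placeholder

resp = requests.get(
    "https://api.imagga.com/v2/tags",
    params={"image_url": "https://example.org/image.jpg"},
    auth=(API_KEY, API_SECRET),
    timeout=30,
)
resp.raise_for_status()

for tag in resp.json()["result"]["tags"]:
    print(tag["tag"]["en"], round(tag["confidence"], 1))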

Google
created on 2022-01-09

Flash photography 86.5
Black-and-white 86
Grey 84.3
Style 84
Wood 83.6
Building 83.1
Flooring 82.8
Floor 82.2
Window 82
Line 81.7
Wall 81.2
Dance 78.5
Shorts 78.4
Monochrome photography 75.8
Monochrome 75.4
Knee 74.5
Balance 73
Event 71.2
T-shirt 71
Hardwood 70.1
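
The Google tags correspond to label annotations from the Cloud Vision API. A minimal sketch with the google-cloud-vision client; the local file name is a placeholder.

# Minimal sketch: label detection with the Google Cloud Vision client,
# producing description/score pairs like the Google tags above.
from google.cloud import vision

client = vision.ImageAnnotatorClient()

with open("image.jpg", "rb") as f:
    image = vision.Image(content=f.read())

response = client.label_detection(image=image)
for label in response.label_annotations:
    print(label.description, round(label.score * 100, 1))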

Microsoft
created on 2022-01-09

outdoor 92.3
text 92.1
black and white 85.4
clothing 81.4
person 80.4
wedding dress 78.5
dance 52.2
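
The Microsoft tags match the shape of results from Azure's Computer Vision image-tagging service. A hedged sketch using the azure-cognitiveservices-vision-computervision SDK; the endpoint, key, and image URL are placeholders, and the SDK version may differ from whatever produced the data above.

# Hedged sketch: image tagging with the Azure Computer Vision SDK.
# Endpoint, subscription key, and image URL are placeholders.
from azure.cognitiveservices.vision.computervision import ComputerVisionClient
from msrest.authentication import CognitiveServicesCredentials

client = ComputerVisionClient(
    "https://<your-resource>.cognitiveservices.azure.com/",
    CognitiveServicesCredentials("your_subscription_key"),
)

result = client.tag_image("https://example.org/image.jpg")
for tag in result.tags:
    print(tag.name, round(tag.confidence * 100, 1))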

Face analysis

AWS Rekognition

Age 30-40
Gender Male, 62%
Disgusted 25.4%
Sad 21.6%
Fear 21.3%
Calm 15.8%
Confused 8.2%
Angry 4.6%
Surprised 1.6%
Happy 1.4%

AWS Rekognition

Age 10-18
Gender Male, 72.7%
Calm 73.7%
Sad 11.7%
Happy 7.5%
Fear 2.3%
Angry 1.7%
Disgusted 1.2%
Surprised 1.2%
Confused 0.7%

AWS Rekognition

Age 20-28
Gender Male, 61.7%
Calm 92.4%
Sad 3.6%
Confused 2%
Surprised 0.6%
Happy 0.5%
Fear 0.4%
Angry 0.3%
Disgusted 0.2%

AWS Rekognition

Age 26-36
Gender Female, 62.9%
Calm 63.9%
Sad 12.8%
Disgusted 8.2%
Angry 6.5%
Fear 4%
Happy 2.7%
Confused 1%
Surprised 1%

AWS Rekognition

Age 22-30
Gender Male, 89.6%
Sad 64.9%
Happy 25.6%
Fear 2.5%
Disgusted 1.7%
Calm 1.5%
Confused 1.5%
Angry 1.4%
Surprised 0.8%

AWS Rekognition

Age 22-30
Gender Female, 52.6%
Happy 82.4%
Calm 14%
Sad 1%
Angry 0.6%
Fear 0.6%
Confused 0.5%
Surprised 0.5%
Disgusted 0.4%

AWS Rekognition

Age 22-30
Gender Male, 78.2%
Calm 85.5%
Sad 8.9%
Happy 2.2%
Confused 1.4%
Angry 0.8%
Surprised 0.5%
Fear 0.4%
Disgusted 0.3%

AWS Rekognition

Age 21-29
Gender Male, 88.9%
Calm 55.3%
Happy 20.2%
Sad 12.5%
Confused 8.1%
Angry 1.4%
Disgusted 0.9%
Surprised 0.8%
Fear 0.8%

AWS Rekognition

Age 16-24
Gender Female, 79.2%
Calm 82%
Happy 8.9%
Sad 5.5%
Angry 1.5%
Confused 0.8%
Fear 0.6%
Disgusted 0.4%
Surprised 0.2%

AWS Rekognition

Age 13-21
Gender Male, 81.9%
Confused 43.8%
Sad 30.1%
Calm 9.1%
Angry 5.2%
Fear 4.1%
Disgusted 3.1%
Surprised 2.3%
Happy 2.2%

AWS Rekognition

Age 21-29
Gender Male, 81%
Calm 83.1%
Confused 7.1%
Sad 5.4%
Happy 1.5%
Surprised 1%
Fear 0.7%
Angry 0.7%
Disgusted 0.6%

AWS Rekognition

Age 20-28
Gender Male, 82.8%
Sad 52.3%
Calm 31.6%
Happy 4.7%
Confused 4.5%
Angry 2%
Fear 1.8%
Surprised 1.8%
Disgusted 1.3%

AWS Rekognition

Age 24-34
Gender Female, 57.2%
Calm 98.7%
Sad 0.8%
Disgusted 0.2%
Fear 0.1%
Angry 0.1%
Happy 0%
Surprised 0%
Confused 0%

AWS Rekognition

Age 25-35
Gender Male, 95.4%
Happy 33%
Sad 29.9%
Calm 27.1%
Confused 3.3%
Disgusted 2.9%
Angry 1.7%
Fear 1.1%
Surprised 1.1%

AWS Rekognition

Age 18-24
Gender Male, 65.2%
Sad 50.2%
Surprised 23.4%
Calm 12.4%
Fear 7.7%
Angry 2.4%
Disgusted 2.4%
Happy 1%
Confused 0.6%

AWS Rekognition

Age 24-34
Gender Male, 91.9%
Calm 69.8%
Sad 14.1%
Happy 10.9%
Confused 2%
Angry 1.4%
Disgusted 0.8%
Fear 0.5%
Surprised 0.5%

AWS Rekognition

Age 20-28
Gender Male, 51.5%
Calm 85.9%
Sad 5.2%
Happy 3.5%
Confused 2.6%
Angry 1%
Fear 0.6%
Surprised 0.6%
Disgusted 0.6%
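
Each AWS Rekognition block above (age range, gender with confidence, and a ranked list of emotion scores) matches the face details returned by Rekognition's face-detection call when all attributes are requested. A minimal sketch; the file name is a placeholder.

# Minimal sketch: face analysis with AWS Rekognition via boto3, printing
# age range, gender, and emotion confidences like the blocks above.
import boto3

rekognition = boto3.client("rekognition")

with open("image.jpg", "rb") as f:
    response = rekognition.detect_faces(
        Image={"Bytes": f.read()},
        Attributes=["ALL"],  # request age, gender, emotions, etc.
    )

for face in response["FaceDetails"]:
    age = face["AgeRange"]
    print(f'Age {age["Low"]}-{age["High"]}')
    print(f'Gender {face["Gender"]["Value"]}, {face["Gender"]["Confidence"]:.1f}%')
    for emotion in sorted(face["Emotions"], key=lambda e: -e["Confidence"]):
        print(f'{emotion["Type"].capitalize()} {emotion["Confidence"]:.1f}%')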

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Likely
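
The two Google Vision blocks report likelihood buckets (Very unlikely through Very likely) rather than percentages, which is how the Cloud Vision face-detection API expresses its estimates. A minimal sketch; the file name is a placeholder.

# Minimal sketch: face detection with Google Cloud Vision, reporting the
# same likelihood buckets used above. The file name is a placeholder.
from google.cloud import vision

LIKELIHOOD = ("Unknown", "Very unlikely", "Unlikely",
              "Possible", "Likely", "Very likely")

client = vision.ImageAnnotatorClient()

with open("image.jpg", "rb") as f:
    image = vision.Image(content=f.read())

response = client.face_detection(image=image)
for face in response.face_annotations:
    print("Surprise", LIKELIHOOD[face.surprise_likelihood])
    print("Anger", LIKELIHOOD[face.anger_likelihood])
    print("Sorrow", LIKELIHOOD[face.sorrow_likelihood])
    print("Joy", LIKELIHOOD[face.joy_likelihood])
    print("Headwear", LIKELIHOOD[face.headwear_likelihood])
    print("Blurred", LIKELIHOOD[face.blurred_likelihood])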

Feature analysis

Amazon

Person 97.5%

Categories

Imagga

interior objects 98.9%

Text analysis

Amazon

as
28469.

Google

VT33A2- YAGON 28469
VT33A2-
YAGON
28469
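
The strings above are raw OCR detections. A minimal sketch of the corresponding AWS Rekognition text-detection call (Google Cloud Vision's text_detection method is analogous); the file name is a placeholder.

# Minimal sketch: text (OCR) detection with AWS Rekognition via boto3,
# returning line-level strings like those above.
import boto3

rekognition = boto3.client("rekognition")

with open("image.jpg", "rb") as f:
    response = rekognition.detect_text(Image={"Bytes": f.read()})

for detection in response["TextDetections"]:
    if detection["Type"] == "LINE":
        print(detection["DetectedText"])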