Human Generated Data

Title

Untitled (children standing at a window)

Date

c. 1920

People

Artist: Joseph Janney Steinmetz, American, 1905–1985

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.7764

Copyright

© Estate of Joseph Janney Steinmetz

Machine Generated Data

Tags (label, confidence score)

Amazon
created on 2022-01-09

Face 99.6
Human 99.6
Person 99.2
Boy 98.7
Smile 98.6
Person 98.6
Person 98.5
Person 98.3
Person 98
Person 97.5
Clothing 94.1
Apparel 94.1
Person 92.1
Kid 92
Child 92
Head 90.3
Female 88.1
Glasses 85.4
Accessories 85.4
Accessory 85.4
Meal 76.7
Food 76.7
Portrait 75.6
Photography 75.6
Photo 75.6
Outdoors 75
Girl 74.4
People 69.5
Vehicle 66.8
Transportation 66.8
Nature 65.8
Tree 62.6
Plant 62.6
Play 59.4
Housing 59.1
Building 59.1
Water 58.5
Cream 57.8
Dessert 57.8
Creme 57.8
Teeth 56.4
Mouth 56.4
Lip 56.4
Teen 55.5
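
Scores like these are confidence values on a 0–100 scale, as returned by Amazon Rekognition's DetectLabels operation. A minimal sketch of how tags of this form can be generated, assuming boto3 is installed and AWS credentials are configured; the file name photo.jpg is a placeholder:

```python
# Minimal sketch: label detection with Amazon Rekognition via boto3.
import boto3

client = boto3.client("rekognition", region_name="us-east-1")

with open("photo.jpg", "rb") as f:  # placeholder image file
    response = client.detect_labels(
        Image={"Bytes": f.read()},
        MaxLabels=50,
        MinConfidence=55.0,  # the lowest scores in the list above hover around 55
    )

for label in response["Labels"]:
    print(f"{label['Name']} {label['Confidence']:.1f}")
```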

Clarifai
created on 2023-10-25

people 99.9
group 99.8
child 99.8
son 98.5
boy 96.9
sibling 95.6
man 93.4
portrait 93.2
movie 92.5
several 91.7
art 91.5
family 91.1
three 91.1
five 90.2
adult 90.2
retro 88.5
wear 87.9
woman 86.7
four 86.3
music 84.9
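
Clarifai concepts of this kind come from its general image-recognition model, which reports values on a 0–1 scale (shown above as percentages). A minimal sketch against the Clarifai v2 REST API; the access token, image URL, and public-model path are placeholders based on Clarifai's documented conventions:

```python
# Minimal sketch: concept tagging with the Clarifai v2 REST API.
import requests

resp = requests.post(
    # Assumed path to Clarifai's public general model; adjust for your app.
    "https://api.clarifai.com/v2/users/clarifai/apps/main/models/general-image-recognition/outputs",
    headers={"Authorization": "Key YOUR_PAT"},  # placeholder personal access token
    json={"inputs": [{"data": {"image": {"url": "https://example.com/photo.jpg"}}}]},
)
resp.raise_for_status()

for concept in resp.json()["outputs"][0]["data"]["concepts"]:
    print(f"{concept['name']} {concept['value'] * 100:.1f}")
```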

Imagga
created on 2022-01-09

television 100
telecommunication system 87
broadcasting 59
telecommunication 43.5
man 34.2
computer 30.5
medium 28.3
male 26.9
laptop 25.9
smiling 25.3
senior 24.3
happy 22.5
people 21.7
office 21.7
person 21.2
monitor 21
business 20.6
adult 20
technology 19.3
sitting 18.9
screen 18
couple 17.4
portrait 16.2
businessman 15.9
smile 14.2
desk 14.2
working 14.1
modern 14
together 14
communication 12.6
work 12.5
equipment 12.3
car 12
looking 12
elderly 11.5
wife 11.4
display 11.3
keyboard 11.3
home 11.2
professional 11
driver 10.7
old 10.4
vehicle 10.3
notebook 10.3
mature 10.2
husband 9.5
men 9.4
businesswoman 9.1
group 8.9
success 8.8
object 8.8
table 8.6
corporate 8.6
finance 8.4
electronic 8.4
student 8.1
gray 8.1
new 8.1
family 8
information 8
hair 7.9
sixties 7.8
60s 7.8
education 7.8
portable 7.8
driving 7.7
married 7.7
automobile 7.7
two 7.6
studio 7.6
enjoying 7.6
drive 7.6
learning 7.5
fun 7.5
camera 7.4
vacation 7.4
window 7.3
indoor 7.3
room 7.3
lifestyle 7.2
suit 7.2
holiday 7.2
women 7.1
face 7.1
love 7.1
indoors 7
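
Imagga's tagging endpoint returns one confidence value per tag on a 0–100 scale. A minimal sketch against the Imagga v2 REST API, assuming HTTP Basic auth; the credentials and image URL are placeholders:

```python
# Minimal sketch: tagging with the Imagga v2 REST API.
import requests

resp = requests.get(
    "https://api.imagga.com/v2/tags",
    params={"image_url": "https://example.com/photo.jpg"},  # placeholder URL
    auth=("API_KEY", "API_SECRET"),  # placeholder Imagga credentials
)
resp.raise_for_status()

for tag in resp.json()["result"]["tags"]:
    print(f"{tag['tag']['en']} {tag['confidence']:.1f}")
```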

Google
created on 2022-01-09

Microsoft
created on 2022-01-09

text 98.1
window 98
human face 96.3
old 87.7
posing 86.6
person 86.3
smile 85
man 78.5
black 68.9
clothing 67.6
gallery 63.3
image 41.6
picture frame 16.6
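
Tags of this form come from Microsoft's Computer Vision Analyze operation, which reports confidence on a 0–1 scale (shown above as percentages). A minimal sketch against the v3.2 REST endpoint, assuming an Azure Computer Vision resource; the host name and subscription key are placeholders:

```python
# Minimal sketch: tagging with the Azure Computer Vision v3.2 Analyze API.
import requests

resp = requests.post(
    "https://YOUR_RESOURCE.cognitiveservices.azure.com/vision/v3.2/analyze",
    params={"visualFeatures": "Tags"},
    headers={"Ocp-Apim-Subscription-Key": "SUBSCRIPTION_KEY"},  # placeholder key
    json={"url": "https://example.com/photo.jpg"},
)
resp.raise_for_status()

for tag in resp.json()["tags"]:
    print(f"{tag['name']} {tag['confidence'] * 100:.1f}")
```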

Face analysis

AWS Rekognition

Age 31-41
Gender Male, 99.4%
Happy 98.1%
Calm 0.7%
Confused 0.7%
Sad 0.2%
Disgusted 0.1%
Surprised 0.1%
Fear 0.1%
Angry 0%

AWS Rekognition

Age 7-17
Gender Male, 98.8%
Happy 83.9%
Calm 12.7%
Surprised 1.1%
Sad 0.9%
Confused 0.8%
Disgusted 0.4%
Angry 0.1%
Fear 0.1%

AWS Rekognition

Age 30-40
Gender Female, 75.1%
Happy 86.8%
Calm 8.3%
Surprised 1.6%
Disgusted 0.9%
Fear 0.9%
Sad 0.6%
Angry 0.5%
Confused 0.3%

AWS Rekognition

Age 25-35
Gender Male, 75.8%
Happy 63.6%
Calm 33.6%
Sad 1.4%
Confused 0.5%
Disgusted 0.3%
Fear 0.2%
Surprised 0.2%
Angry 0.1%

AWS Rekognition

Age 19-27
Gender Female, 97.9%
Happy 72.8%
Surprised 11.9%
Sad 5.5%
Calm 5.1%
Fear 1.7%
Disgusted 1.4%
Angry 1.1%
Confused 0.7%
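
Each block above corresponds to one detected face. Estimates of this form come from Amazon Rekognition's DetectFaces operation with all facial attributes requested. A minimal sketch, with the same boto3 assumptions as above:

```python
# Minimal sketch: face analysis with Amazon Rekognition DetectFaces.
import boto3

client = boto3.client("rekognition", region_name="us-east-1")

with open("photo.jpg", "rb") as f:  # placeholder image file
    response = client.detect_faces(
        Image={"Bytes": f.read()},
        Attributes=["ALL"],  # request age range, gender, emotions, etc.
    )

for face in response["FaceDetails"]:
    age = face["AgeRange"]
    print(f"Age {age['Low']}-{age['High']}")
    print(f"Gender {face['Gender']['Value']}, {face['Gender']['Confidence']:.1f}%")
    # Emotions come back unsorted; sort by confidence, as in the listing above.
    for emotion in sorted(face["Emotions"], key=lambda e: -e["Confidence"]):
        print(f"{emotion['Type'].title()} {emotion['Confidence']:.1f}%")
```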

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely
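
Google Vision reports face attributes as likelihood buckets (VERY_UNLIKELY through VERY_LIKELY) rather than percentages. A minimal sketch using the google-cloud-vision client library, assuming application-default credentials are set up; photo.jpg is a placeholder:

```python
# Minimal sketch: face detection with the Google Cloud Vision API.
from google.cloud import vision

client = vision.ImageAnnotatorClient()

with open("photo.jpg", "rb") as f:  # placeholder image file
    image = vision.Image(content=f.read())

response = client.face_detection(image=image)

for face in response.face_annotations:
    # Each attribute is a Likelihood enum; .name yields e.g. "VERY_UNLIKELY".
    print("Surprise", face.surprise_likelihood.name)
    print("Anger", face.anger_likelihood.name)
    print("Sorrow", face.sorrow_likelihood.name)
    print("Joy", face.joy_likelihood.name)
    print("Headwear", face.headwear_likelihood.name)
    print("Blurred", face.blurred_likelihood.name)
```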

Feature analysis

Amazon

Person 99.2%

Text analysis

Amazon

24526.
VAGOY
VI77A2
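
Strings like these are OCR output; Rekognition's DetectText returns both line- and word-level detections. A minimal sketch, with the same boto3 assumptions as above:

```python
# Minimal sketch: OCR with Amazon Rekognition DetectText.
import boto3

client = boto3.client("rekognition", region_name="us-east-1")

with open("photo.jpg", "rb") as f:  # placeholder image file
    response = client.detect_text(Image={"Bytes": f.read()})

for detection in response["TextDetections"]:
    if detection["Type"] == "LINE":  # skip the per-word entries
        print(detection["DetectedText"])
```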

Google

24526.
24526.
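
Google Vision's text detection returns the full detected text as the first annotation, followed by each individual word, which is why the same string appears twice above. A minimal sketch, with the same client-library assumptions as the face-detection example:

```python
# Minimal sketch: OCR with the Google Cloud Vision API.
from google.cloud import vision

client = vision.ImageAnnotatorClient()

with open("photo.jpg", "rb") as f:  # placeholder image file
    image = vision.Image(content=f.read())

response = client.text_detection(image=image)

# First annotation is the full detected text; the rest are individual words.
for annotation in response.text_annotations:
    print(annotation.description)
```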