Human Generated Data

Title

Untitled (couple with young children next to Christmas tree)

Date

1958

People

Artist: Joseph Janney Steinmetz, American, 1905–1985

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.8045

Copyright

© Estate of Joseph Janney Steinmetz

Machine Generated Data

Tags (label, confidence %)

Amazon
created on 2022-01-15

Person 99.7
Human 99.7
Person 99
Clothing 97.2
Apparel 97.2
Person 96.3
Person 95.8
People 91.9
Face 90
Pants 89.4
Chair 87.7
Furniture 87.7
Tree 85.1
Plant 85.1
Female 84.7
Woman 70.9
Kid 70.9
Child 70.9
Girl 70.8
Door 68.7
Teen 67.5
Family 65.2
Photography 64.3
Photo 64.3
Jeans 62.3
Denim 62.3
Urban 61.3
Text 60.1
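
Labels like these are what the Amazon Rekognition DetectLabels API returns for an image. As a minimal sketch of how such a list can be produced, assuming boto3 with configured AWS credentials and a local scan of the photograph (the filename is hypothetical):

    # Minimal sketch: label detection with Amazon Rekognition (boto3).
    # Assumes AWS credentials are configured; "steinmetz_8045.jpg" is a
    # hypothetical local scan of the photograph.
    import boto3

    rekognition = boto3.client("rekognition")

    with open("steinmetz_8045.jpg", "rb") as f:
        response = rekognition.detect_labels(
            Image={"Bytes": f.read()},
            MaxLabels=50,
            MinConfidence=60,  # the record lists tags down to about 60
        )

    for label in response["Labels"]:
        print(f'{label["Name"]} {label["Confidence"]:.1f}')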

Clarifai
created on 2023-10-26

people 99
monochrome 97.7
family 94.2
room 93.3
man 93.1
woman 91.1
street 90.4
adult 90.2
two 88.8
home 88.5
house 87.7
window 86.6
chair 85.9
indoors 83.3
furniture 79.1
couple 77.5
one 77.2
child 75.7
group together 75.3
elderly 74.2
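
Concept lists like Clarifai's are served by its v2 predict endpoint. A sketch of one way to request them over plain HTTP, assuming a personal access token and the public general-image-recognition model (the exact model path can vary by account, and the token and image URL are placeholders):

    # Sketch: concept prediction via Clarifai's v2 REST API.
    # The model path, token, and image URL are assumptions/placeholders.
    import requests

    PAT = "your_personal_access_token"
    IMAGE_URL = "https://example.org/steinmetz_8045.jpg"
    MODEL_URL = ("https://api.clarifai.com/v2/users/clarifai/apps/main"
                 "/models/general-image-recognition/outputs")

    response = requests.post(
        MODEL_URL,
        headers={"Authorization": f"Key {PAT}"},
        json={"inputs": [{"data": {"image": {"url": IMAGE_URL}}}]},
        timeout=30,
    )
    response.raise_for_status()

    for concept in response.json()["outputs"][0]["data"]["concepts"]:
        # "value" is a 0-1 score; the record shows it as a percentage.
        print(f'{concept["name"]} {concept["value"] * 100:.1f}')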

Imagga
created on 2022-01-15

person 26.5
man 20.1
people 18.4
adult 15.1
male 14.2
room 13.2
women 12.6
blackboard 12.3
planner 12.2
lifestyle 11.6
sketch 11.3
pretty 11.2
building 11.2
portrait 11
business 10.9
drawing 10.7
hand 10.6
businessman 10.6
work 10.5
device 10.2
crutch 10.1
smiling 10.1
happy 10
city 10
silhouette 9.9
attractive 9.8
standing 9.6
men 9.4
sport 9.4
day 9.4
office 9.2
alone 9.1
indoor 9.1
window 9.1
human 9
chair 9
one 8.9
working 8.8
equipment 8.7
black 8.5
casual 8.5
life 8.4
old 8.4
worker 8.1
success 8
job 8
staff 7.8
smile 7.8
face 7.8
sax 7.7
sitting 7.7
power 7.5
holding 7.4
water 7.3
looking 7.2
cute 7.2
transportation 7.2
hair 7.1
interior 7.1
architecture 7
indoors 7
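
Imagga's tags map to its v2 /tags REST endpoint, which takes an image URL and authenticates with an API key/secret pair over HTTP Basic auth. A sketch under those assumptions (the credentials and image URL are placeholders):

    # Sketch: image tagging via the Imagga v2 REST API.
    # The key/secret pair and image URL are placeholders.
    import requests

    API_KEY = "your_api_key"
    API_SECRET = "your_api_secret"
    IMAGE_URL = "https://example.org/steinmetz_8045.jpg"

    response = requests.get(
        "https://api.imagga.com/v2/tags",
        params={"image_url": IMAGE_URL},
        auth=(API_KEY, API_SECRET),
        timeout=30,
    )
    response.raise_for_status()

    for item in response.json()["result"]["tags"]:
        # Imagga reports confidence on a 0-100 scale, as in the record.
        print(f'{item["tag"]["en"]} {item["confidence"]:.1f}')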

Google
created on 2022-01-15

Microsoft
created on 2022-01-15

text 95.6
outdoor 89.3
person 70.5
drawing 53.5
clothing 51.2
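
Microsoft's tags correspond to the Azure Computer Vision tagging operation. A sketch using the Python SDK, assuming a provisioned Computer Vision resource (the endpoint, key, and image URL are placeholders):

    # Sketch: image tagging with the Azure Computer Vision Python SDK.
    # Endpoint, key, and image URL are placeholders for a real resource.
    from azure.cognitiveservices.vision.computervision import ComputerVisionClient
    from msrest.authentication import CognitiveServicesCredentials

    client = ComputerVisionClient(
        endpoint="https://your-resource.cognitiveservices.azure.com/",
        credentials=CognitiveServicesCredentials("your_key"),
    )

    result = client.tag_image(url="https://example.org/steinmetz_8045.jpg")
    for tag in result.tags:
        # Confidence comes back on a 0-1 scale; the record shows percentages.
        print(f"{tag.name} {tag.confidence * 100:.1f}")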

Color Analysis

Face analysis

AWS Rekognition

Age 29-39
Gender Male, 88.8%
Happy 64.8%
Sad 19.7%
Surprised 12.2%
Disgusted 1%
Angry 0.8%
Fear 0.6%
Confused 0.4%
Calm 0.4%

AWS Rekognition

Age 30-40
Gender Female, 61.5%
Sad 71.3%
Happy 10%
Calm 8.8%
Confused 5%
Disgusted 1.4%
Fear 1.4%
Angry 1.1%
Surprised 1%

AWS Rekognition

Age 37-45
Gender Male, 98.1%
Calm 98.2%
Sad 0.7%
Happy 0.6%
Confused 0.1%
Angry 0.1%
Disgusted 0.1%
Surprised 0.1%
Fear 0.1%
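
The three AWS Rekognition blocks above are per-face estimates of the kind the DetectFaces API returns when all attributes are requested. A minimal sketch, under the same boto3 assumptions as the label example:

    # Minimal sketch: per-face age, gender, and emotion estimates with
    # Amazon Rekognition DetectFaces. "steinmetz_8045.jpg" is hypothetical.
    import boto3

    rekognition = boto3.client("rekognition")

    with open("steinmetz_8045.jpg", "rb") as f:
        response = rekognition.detect_faces(
            Image={"Bytes": f.read()},
            Attributes=["ALL"],  # include age range, gender, and emotions
        )

    for face in response["FaceDetails"]:
        age = face["AgeRange"]
        gender = face["Gender"]
        print(f'Age {age["Low"]}-{age["High"]}')
        print(f'Gender {gender["Value"]}, {gender["Confidence"]:.1f}%')
        for emotion in sorted(face["Emotions"], key=lambda e: -e["Confidence"]):
            print(f'{emotion["Type"].capitalize()} {emotion["Confidence"]:.1f}%')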

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Unlikely
Blurred Very unlikely
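
The Google Vision rows report likelihood buckets (Very unlikely through Very likely) rather than percentages. A sketch of the corresponding face-detection call, assuming the google-cloud-vision client library with application credentials (the filename is hypothetical):

    # Sketch: face likelihood buckets with the Google Cloud Vision API.
    # Assumes GOOGLE_APPLICATION_CREDENTIALS is set; the filename is
    # hypothetical.
    from google.cloud import vision

    client = vision.ImageAnnotatorClient()

    with open("steinmetz_8045.jpg", "rb") as f:
        image = vision.Image(content=f.read())

    response = client.face_detection(image=image)

    for face in response.face_annotations:
        # Each field is a Likelihood enum, VERY_UNLIKELY ... VERY_LIKELY.
        print("Surprise", face.surprise_likelihood.name)
        print("Anger", face.anger_likelihood.name)
        print("Sorrow", face.sorrow_likelihood.name)
        print("Joy", face.joy_likelihood.name)
        print("Headwear", face.headwear_likelihood.name)
        print("Blurred", face.blurred_likelihood.name)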

Feature analysis

Amazon

Person 99.7%

Categories

Text analysis

Amazon

43754.
:
MJ17--Y137--

Google

43754.
43754.
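
Strings like "43754." and "MJ17--Y137--" look like frame or sleeve markings on the negative picked up by OCR. A sketch of the Rekognition side of that text detection, under the same boto3 assumptions as above:

    # Minimal sketch: text detection with Amazon Rekognition DetectText.
    # "steinmetz_8045.jpg" is a hypothetical local scan.
    import boto3

    rekognition = boto3.client("rekognition")

    with open("steinmetz_8045.jpg", "rb") as f:
        response = rekognition.detect_text(Image={"Bytes": f.read()})

    for detection in response["TextDetections"]:
        if detection["Type"] == "LINE":  # skip word-level duplicates
            print(detection["DetectedText"])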