Human Generated Data

Title

Untitled (two women and a priest standing near model of a church)

Date

1942

People

Artist: Joseph Janney Steinmetz, American 1905 - 1985

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.7125

Copyright

© Estate of Joseph Janney Steinmetz

Machine Generated Data

Tags

Amazon
created on 2021-12-15

Person 98.9
Human 98.9
Person 98.4
Person 92.5
Clothing 79.1
Sleeve 79.1
Apparel 79.1
Text 73
Photo 61.9
Portrait 61.9
Face 61.9
Photography 61.9
People 61.9
Suit 58.1
Coat 58.1
Overcoat 58.1
Home Decor 56.4
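
The Amazon tags above are confidence-scored labels of the kind returned by Amazon Rekognition's label detection. The sketch below shows how such a list can be produced with boto3; it is illustrative only, and the file name, region, and confidence threshold are placeholder assumptions rather than values from this record.

import boto3

client = boto3.client("rekognition", region_name="us-east-1")  # placeholder region

with open("photo.jpg", "rb") as f:  # placeholder file name
    image_bytes = f.read()

# Each returned label carries a Name and a 0-100 Confidence, matching the
# "Person 98.9" style of the list above.
response = client.detect_labels(Image={"Bytes": image_bytes}, MinConfidence=55)
for label in response["Labels"]:
    print(f'{label["Name"]} {label["Confidence"]:.1f}')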

Clarifai
created on 2023-10-15

people 99.4
man 97.2
adult 97.1
monochrome 94.6
woman 93.3
uniform 91
group 90
wear 86.1
administration 85.5
vertical 82.9
group together 78.4
indoors 77.3
leader 74.6
three 69.2
stump 67.4
side view 62.8
healthcare 62.8
horizontal 62.7
two 61.1
authority 59.7
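
The Clarifai tags follow the same name-plus-confidence pattern. Below is a hedged sketch of a request against Clarifai's documented v2 REST endpoint; the API key, model ID, and image URL are placeholders, and the exact model used for this record is not stated in the data.

import requests

API_KEY = "YOUR_CLARIFAI_API_KEY"            # placeholder
MODEL_ID = "general-image-recognition"       # assumed public general model
IMAGE_URL = "https://example.org/photo.jpg"  # placeholder

response = requests.post(
    f"https://api.clarifai.com/v2/models/{MODEL_ID}/outputs",
    headers={"Authorization": f"Key {API_KEY}"},
    json={"inputs": [{"data": {"image": {"url": IMAGE_URL}}}]},
)
response.raise_for_status()

# Concepts come back with a name and a 0-1 value, shown above as percentages.
for concept in response.json()["outputs"][0]["data"]["concepts"]:
    print(f'{concept["name"]} {concept["value"] * 100:.1f}')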

Imagga
created on 2021-12-15

person 43.8
man 41.7
male 40.5
adult 37.3
people 34.1
professional 32.9
patient 32.2
teacher 30.3
happy 28.2
smiling 26.8
doctor 26.3
nurse 26.3
mature 25.1
hospital 25.1
medical 24.7
indoors 24.6
businessman 23
women 22.9
senior 22.5
men 22.3
business 21.9
colleagues 21.4
sitting 20.6
educator 20.1
office 20.1
couple 20
team 19.7
health 19.5
worker 19
room 18.9
day 18.8
job 18.6
lifestyle 18.1
modern 17.5
table 17.3
work 17.3
coat 17.3
casual 17
clinic 16.7
30s 16.4
portrait 16.2
teamwork 15.8
occupation 15.6
businesswoman 15.5
group 15.3
lab coat 15.3
businesspeople 15.2
meeting 15.1
working 15
elderly 14.4
talking 14.3
together 14
care 14
associates 13.8
corporate 13.8
two people 13.6
bright 13.6
medicine 13.2
waiter 13.2
smile 12.8
20s 12.8
home 12.8
40s 12.7
desk 12.3
sick person 12.1
camera 12
case 11.9
indoor 11.9
coworkers 11.8
illness 11.4
education 11.3
looking 11.2
clothing 10.8
discussion 10.7
retired 10.7
mid adult 10.6
daytime 10.6
color 10.6
dining-room attendant 10.6
human 10.5
standing 10.4
computer 10.4
life 10.2
treatment 10.1
laptop 10
suit 9.9
days 9.8
casual clothing 9.8
percussion instrument 9.8
to 9.7
technology 9.7
angle 9.6
adults 9.5
happiness 9.4
student 9.4
executive 9.1
old 9.1
surgeon 9
cheerful 8.9
business people 8.9
doctors 8.9
interior 8.9
discussing 8.8
60s 8.8
clinical 8.8
thirties 8.8
full length 8.7
light 8.7
laboratory 8.7
four 8.6
employee 8.6
face 8.5
two 8.5
clothes 8.4
hand 8.4
holding 8.3
healing 8
musical instrument 7.9
practitioner 7.9
gray hair 7.9
70s 7.9
conference 7.8
businessmen 7.8
lab 7.8
emotions 7.8
class 7.7
test 7.7
classroom 7.7
retirement 7.7
horizontal 7.5
focus 7.4
emotion 7.4
seller 7.2
science 7.1
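
Imagga's tagging endpoint returns the same style of confidence-scored tags. A minimal sketch, assuming the v2 /tags endpoint with HTTP basic authentication; the credentials and image URL are placeholders.

import requests

API_KEY = "YOUR_IMAGGA_KEY"                  # placeholder
API_SECRET = "YOUR_IMAGGA_SECRET"            # placeholder
IMAGE_URL = "https://example.org/photo.jpg"  # placeholder

response = requests.get(
    "https://api.imagga.com/v2/tags",
    params={"image_url": IMAGE_URL},
    auth=(API_KEY, API_SECRET),
)
response.raise_for_status()

# Each entry pairs an English tag with a confidence score.
for tag in response.json()["result"]["tags"]:
    print(f'{tag["tag"]["en"]} {tag["confidence"]:.1f}')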

Google
created on 2021-12-15

Shorts 88.4
Sleeve 87.2
Black-and-white 85.1
Style 83.8
Eyewear 79.9
Hat 79.8
Fashion design 74.6
Font 74.2
T-shirt 72.8
Monochrome photography 71.4
Monochrome 71.2
Event 70.5
Service 68.6
Retail 66.5
Room 62.5
Advertising 60.5
Brand 58.7
Photo caption 56
Vintage clothing 55.7
Uniform 55.3
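
The Google tags correspond to label detection in the Cloud Vision API. A minimal sketch with the google-cloud-vision client library follows; the local file name is a placeholder, and application credentials are assumed to be configured in the environment.

from google.cloud import vision

client = vision.ImageAnnotatorClient()

with open("photo.jpg", "rb") as f:  # placeholder file name
    image = vision.Image(content=f.read())

response = client.label_detection(image=image)

# Scores are 0-1 floats; the list above shows them as percentages.
for label in response.label_annotations:
    print(f"{label.description} {label.score * 100:.1f}")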

Microsoft
created on 2021-12-15

text 98.4
man 94.8
person 92.5
clothing 89.7
standing 86.1
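
The Microsoft tags are of the kind returned by Azure Computer Vision. A hedged sketch with the azure-cognitiveservices-vision-computervision SDK follows; the endpoint, key, and image URL are placeholders.

from azure.cognitiveservices.vision.computervision import ComputerVisionClient
from msrest.authentication import CognitiveServicesCredentials

ENDPOINT = "https://YOUR_RESOURCE.cognitiveservices.azure.com/"  # placeholder
KEY = "YOUR_AZURE_KEY"                                           # placeholder
IMAGE_URL = "https://example.org/photo.jpg"                      # placeholder

client = ComputerVisionClient(ENDPOINT, CognitiveServicesCredentials(KEY))

# tag_image returns a TagResult whose tags each carry a name and a 0-1 confidence.
result = client.tag_image(IMAGE_URL)
for tag in result.tags:
    print(f"{tag.name} {tag.confidence * 100:.1f}")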

Face analysis

AWS Rekognition

Age 22-34
Gender Male, 61.3%
Calm 81.9%
Sad 7.5%
Surprised 5.1%
Confused 2.6%
Angry 1.2%
Happy 0.8%
Disgusted 0.4%
Fear 0.4%

AWS Rekognition

Age 50-68
Gender Male, 97.1%
Calm 99.6%
Surprised 0.1%
Sad 0.1%
Angry 0%
Happy 0%
Confused 0%
Disgusted 0%
Fear 0%

AWS Rekognition

Age 26-40
Gender Male, 69.2%
Calm 87%
Happy 8.1%
Sad 1.9%
Confused 1.7%
Surprised 0.9%
Angry 0.2%
Disgusted 0.1%
Fear 0.1%

AWS Rekognition

Age 26-42
Gender Male, 87.8%
Calm 41.2%
Happy 18.5%
Surprised 18%
Fear 7%
Angry 5.2%
Sad 4.8%
Disgusted 3.4%
Confused 2%
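
Each AWS Rekognition block above (age range, gender, and per-face emotion confidences) matches the output of Rekognition face detection with full attributes. A minimal sketch with boto3; the file name and region are placeholders.

import boto3

client = boto3.client("rekognition", region_name="us-east-1")  # placeholder region

with open("photo.jpg", "rb") as f:  # placeholder file name
    image_bytes = f.read()

response = client.detect_faces(Image={"Bytes": image_bytes}, Attributes=["ALL"])

for face in response["FaceDetails"]:
    age = face["AgeRange"]
    gender = face["Gender"]
    print(f'Age {age["Low"]}-{age["High"]}')
    print(f'Gender {gender["Value"]}, {gender["Confidence"]:.1f}%')
    # Emotion types come back in upper case (e.g. CALM) with 0-100 confidences.
    for emotion in face["Emotions"]:
        print(f'{emotion["Type"].title()} {emotion["Confidence"]:.1f}%')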

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Unlikely
Blurred Very unlikely
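
The Google Vision blocks report per-face likelihood buckets rather than percentages. A minimal sketch with the google-cloud-vision client library; the file name is a placeholder, and the API returns enum names such as VERY_UNLIKELY rather than the display form used above.

from google.cloud import vision

client = vision.ImageAnnotatorClient()

with open("photo.jpg", "rb") as f:  # placeholder file name
    image = vision.Image(content=f.read())

response = client.face_detection(image=image)

for face in response.face_annotations:
    print("Surprise", vision.Likelihood(face.surprise_likelihood).name)
    print("Anger", vision.Likelihood(face.anger_likelihood).name)
    print("Sorrow", vision.Likelihood(face.sorrow_likelihood).name)
    print("Joy", vision.Likelihood(face.joy_likelihood).name)
    print("Headwear", vision.Likelihood(face.headwear_likelihood).name)
    print("Blurred", vision.Likelihood(face.blurred_likelihood).name)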

Feature analysis

Amazon

Person 98.9%

Categories

Imagga

paintings art 97.7%
text visuals 1.5%

Text analysis

Amazon

19621.
INSTITUTE
VS2

Google

19621. INSTITUTE 19621. MAMT2A3
19621.
INSTITUTE
MAMT2A3
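
The text fragments above are OCR detections. Below is a hedged sketch of the two calls that produce this kind of output, Rekognition text detection and Cloud Vision text detection; the file name and region are placeholders.

import boto3
from google.cloud import vision

with open("photo.jpg", "rb") as f:  # placeholder file name
    image_bytes = f.read()

# Amazon: line- and word-level detections, each with DetectedText and a confidence.
rekognition = boto3.client("rekognition", region_name="us-east-1")  # placeholder region
for detection in rekognition.detect_text(Image={"Bytes": image_bytes})["TextDetections"]:
    print(detection["Type"], detection["DetectedText"])

# Google: the first annotation is the full text block, the rest are individual words.
vision_client = vision.ImageAnnotatorClient()
response = vision_client.text_detection(image=vision.Image(content=image_bytes))
for annotation in response.text_annotations:
    print(annotation.description)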