Human Generated Data

Title

Untitled (group portrait in studio)

Date

c. 1945

People

Artist: John Deusing, American, active 1940s

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.1843

Machine Generated Data

Tags

Amazon
created on 2021-12-14

Person 99.6
Human 99.6
Person 99.4
Person 99.3
Person 99.3
Person 99.2
Person 98.8
Person 95.5
Person 93
Shoe 88.8
Apparel 88.8
Footwear 88.8
Clothing 88.8
Chef 63.2
People 55.4
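
The tag lists above pair each label with a confidence score (a percentage). A minimal Python sketch of how such tag/confidence pairs are typically filtered by a confidence threshold, using values transcribed from the Amazon list above (the threshold of 90 is an illustrative choice, not part of the record):

```python
# Label/confidence pairs as returned by an image-tagging service
# (values transcribed from the Amazon tag list above).
amazon_tags = [
    ("Person", 99.6), ("Human", 99.6), ("Person", 99.4),
    ("Shoe", 88.8), ("Apparel", 88.8), ("Footwear", 88.8),
    ("Clothing", 88.8), ("Chef", 63.2), ("People", 55.4),
]

def confident_labels(tags, threshold=90.0):
    """Return the distinct labels whose confidence meets the threshold."""
    seen = []
    for label, score in tags:
        if score >= threshold and label not in seen:
            seen.append(label)
    return seen

print(confident_labels(amazon_tags))  # only the high-confidence labels
```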

Clarifai
created on 2023-10-15

people 99.6
group together 98.5
adult 97.6
group 97.5
man 96.7
many 96.3
uniform 90.4
woman 86.3
medical practitioner 83.7
five 83.2
several 82.8
wear 80.2
outfit 79.2
leader 76.7
four 76.3
education 73.2
position 69.9
facial expression 66.9
three 65.5
concentration 64.7

Imagga
created on 2021-12-14

person 43.5
nurse 40.7
man 36.3
player 36
male 34.8
golfer 32.8
people 30.7
contestant 26.4
happy 25.1
men 24
adult 23.1
professional 22
team 20.6
doctor 19.7
couple 19.2
business 18.8
smiling 18.1
group 17.7
businessman 17.6
medical 17.6
hospital 17.4
smile 17.1
job 16.8
uniform 16.3
portrait 16.2
standing 15.6
happiness 14.9
worker 14.8
lifestyle 14.4
health 13.9
black 13.8
two 13.5
teamwork 13
success 12.9
women 12.6
stethoscope 12.5
handsome 12.5
patient 12.5
medicine 12.3
together 12.3
boy 12.2
occupation 11.9
sport 11.9
love 11.8
confident 11.8
cheerful 11.4
athlete 11.2
suit 10.9
doctors 10.8
clinic 10.8
coat 10.7
care 10.7
family 10.7
attractive 10.5
looking 10.4
ballplayer 10.2
child 10.2
day 10.2
friendly 10.1
dark 10
kin 10
hand 9.9
human 9.7
partnership 9.6
successful 9.1
holding 9.1
husband 9.1
practice 8.7
mid adult 8.7
wife 8.5
employee 8.5
youth 8.5
pretty 8.4
mature 8.4
leisure 8.3
positive 8.3
one 8.2
businesswoman 8.2
guy 8.1
office 8
practitioner 8
work 7.8
lab coat 7.8
clinical 7.8
colleagues 7.8
partner 7.7
30s 7.7
healthy 7.6
friendship 7.5
fun 7.5
executive 7.4
world 7.3
active 7.3
girls 7.3
exercise 7.3
life 7.3
case 7.1
student 7.1
working 7.1

Google
created on 2021-12-14

White 92.2
Black 89.6
Curtain 86.5
Suit 78.3
Vintage clothing 73.9
Formal wear 73.2
Monochrome photography 72.8
Event 72.3
Monochrome 71.4
Uniform 71.1
Crew 67.3
Hat 66.1
Team 63.5
Photo caption 62.3
History 57.9
Room 55.9
Ceremony 54.6
Family 54.1
Fun 52.1
Photography 51.6

Microsoft
created on 2021-12-14

person 96.6
black and white 78.2
posing 74.1
player 73.5
clothing 62.8
man 55.1
dancer 54.8
clothes 16.4

Face analysis

AWS Rekognition

Age 22-34
Gender Female, 59.7%
Calm 43.6%
Surprised 37.5%
Confused 6.2%
Angry 4.5%
Fear 4%
Sad 3.1%
Happy 0.7%
Disgusted 0.3%

AWS Rekognition

Age 41-59
Gender Male, 98.5%
Calm 93.3%
Happy 3.3%
Surprised 1.2%
Sad 1%
Confused 0.5%
Fear 0.3%
Disgusted 0.2%
Angry 0.2%

AWS Rekognition

Age 22-34
Gender Female, 70.1%
Surprised 83.4%
Calm 12.6%
Happy 1.9%
Confused 0.7%
Fear 0.5%
Angry 0.5%
Disgusted 0.2%
Sad 0.1%

AWS Rekognition

Age 24-38
Gender Male, 96.7%
Calm 57.6%
Happy 31.3%
Sad 3.9%
Surprised 2.4%
Angry 2.4%
Confused 1.8%
Disgusted 0.4%
Fear 0.3%

AWS Rekognition

Age 22-34
Gender Female, 97.7%
Calm 55.5%
Happy 13.7%
Surprised 9.6%
Sad 8.3%
Fear 4.5%
Angry 3.4%
Confused 3.4%
Disgusted 1.5%

AWS Rekognition

Age 51-69
Gender Male, 98%
Calm 57.5%
Confused 18.8%
Happy 12.3%
Surprised 5.6%
Sad 3.3%
Disgusted 1.5%
Angry 0.8%
Fear 0.3%

AWS Rekognition

Age 22-34
Gender Male, 91.2%
Surprised 94.9%
Calm 3%
Fear 0.7%
Confused 0.6%
Happy 0.4%
Angry 0.3%
Disgusted 0.1%
Sad 0.1%

AWS Rekognition

Age 22-34
Gender Female, 84.8%
Surprised 49.8%
Calm 34.5%
Sad 5.1%
Fear 4.3%
Happy 3.3%
Angry 1.8%
Confused 1%
Disgusted 0.3%

AWS Rekognition

Age 22-34
Gender Female, 90.5%
Calm 89.6%
Surprised 6%
Happy 2.1%
Sad 1.1%
Confused 0.5%
Angry 0.4%
Disgusted 0.3%
Fear 0.1%
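
Each face block above is a distribution of emotion confidences summing to roughly 100%. A small sketch of how the dominant emotion per face can be read off such distributions, using two of the faces listed above (the data structure is an assumption; the percentages are transcribed from the record):

```python
# Emotion distributions for two of the faces above (percent values
# transcribed from the AWS Rekognition blocks).
faces = [
    {"Calm": 93.3, "Happy": 3.3, "Surprised": 1.2, "Sad": 1.0},
    {"Surprised": 94.9, "Calm": 3.0, "Fear": 0.7, "Confused": 0.6},
]

def dominant_emotion(scores):
    """Return the emotion label with the highest confidence."""
    return max(scores, key=scores.get)

print([dominant_emotion(f) for f in faces])
```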

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Unlikely
Headwear Unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Unlikely
Headwear Unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Possible
Blurred Very unlikely

Feature analysis

Amazon

Person 99.6%
Shoe 88.8%

Categories

Imagga

people portraits 99.4%

Text analysis

Amazon

191
ill
IN