Human Generated Data

Title

Untitled (group of kids in suits and dresses seated for portrait)

Date

c. 1940

People

Artist: John Deusing, American, active 1940s

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.1625

Machine Generated Data

Tags

Amazon
created on 2021-12-14

Person 99.3
Human 99.3
Person 99.1
Person 98.8
Person 98.4
Person 97.7
Person 97.6
Person 96.5
People 95.2
Person 93.6
Apparel 92.9
Clothing 92.9
Person 91.7
Person 91.7
Person 91.5
Person 91
Person 90.6
Person 89.2
Person 87.5
Poster 87.2
Advertisement 87.2
Person 81.4
Person 76.4
Family 75.8
Shorts 64.5
Female 61
Person 59.5
Crowd 56.9
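The tag lists above pair each label with a confidence score on a 0–100 scale. A minimal sketch of how such output might be filtered by a confidence threshold; the (label, score) pairs are copied from the Amazon list above, and the 80.0 cutoff is an arbitrary illustrative choice, not a value used by any of these services:

```python
# Hypothetical sketch: keep only machine-generated tags above a
# confidence threshold. Pairs mirror part of the Amazon list above.
labels = [
    ("Person", 99.3),
    ("Poster", 87.2),
    ("Advertisement", 87.2),
    ("Shorts", 64.5),
    ("Crowd", 56.9),
]

# 80.0 is an illustrative cutoff; real pipelines tune this per task.
confident = [name for name, score in labels if score >= 80.0]
print(confident)
```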

Clarifai
created on 2023-10-15

people 100
group 99.9
child 99.6
many 99.4
education 97.5
boy 97.5
son 97.2
man 96.2
leader 95.9
group together 95.5
school 95.1
adult 94.8
woman 91.4
snapshot 88.6
wear 87.6
uniform 86.6
portrait 85.6
music 83.6
musician 83.1
centennial 82.9

Imagga
created on 2021-12-14

kin 100
people 24.5
wind instrument 13.3
brass 13.2
musical instrument 13
old 12.5
silhouette 12.4
group 12.1
men 12
man 11.4
male 11.3
couple 11.3
women 11.1
portrait 11
person 10.8
happiness 10.2
happy 10
vintage 9.9
religion 9.8
human 9.7
crowd 9.6
art 9.5
adult 9.3
style 8.9
smiling 8.7
love 8.7
scene 8.6
holiday 8.6
grunge 8.5
black 8.4
girls 8.2
water 8
smile 7.8
outdoor 7.6
dark 7.5
friendship 7.5
traditional 7.5
outdoors 7.5
sport 7.4
symbol 7.4
cheerful 7.3
child 7.2
body 7.2
cornet 7.2
night 7.1
face 7.1

Google
created on 2021-12-14

People 77.8
Art 73.4
Team 72.9
Vintage clothing 72.1
Crew 68.8
History 67.1
Uniform 66.1
Event 65.9
Classic 65.3
Illustration 63
Suit 61.6
Room 60.5
Font 60.5
Monochrome 59.4
Personal protective equipment 54.5
Painting 54.4
Visual arts 52.9
Recreation 52.8
Collection 51.8
Team sport 51

Microsoft
created on 2021-12-14

text 99.9
posing 98.3
person 97.9
old 96.6
clothing 95.3
window 94.3
group 84.7
man 81.8

Face analysis

AWS Rekognition

Age 22-34
Gender Male, 91.7%
Calm 87.9%
Surprised 5.1%
Sad 2.9%
Happy 1.4%
Fear 1.1%
Angry 0.8%
Confused 0.5%
Disgusted 0.3%

AWS Rekognition

Age 22-34
Gender Female, 72.6%
Calm 78.4%
Sad 18.6%
Happy 1.9%
Surprised 0.4%
Confused 0.4%
Angry 0.2%
Disgusted 0.1%
Fear 0%

AWS Rekognition

Age 29-45
Gender Male, 78.9%
Happy 53.9%
Calm 35%
Sad 7.4%
Confused 1.4%
Surprised 1.3%
Angry 0.5%
Fear 0.2%
Disgusted 0.2%

AWS Rekognition

Age 20-32
Gender Male, 84.5%
Calm 87.3%
Happy 4.2%
Sad 3.3%
Surprised 2.8%
Confused 1.7%
Fear 0.3%
Angry 0.3%
Disgusted 0.1%

AWS Rekognition

Age 22-34
Gender Female, 68.1%
Calm 70.6%
Sad 13.3%
Happy 9.5%
Confused 4%
Surprised 1.1%
Angry 0.7%
Disgusted 0.5%
Fear 0.3%

AWS Rekognition

Age 42-60
Gender Male, 90.3%
Calm 86.1%
Confused 4.5%
Surprised 4.4%
Sad 2.7%
Angry 1.2%
Happy 0.4%
Fear 0.4%
Disgusted 0.3%

AWS Rekognition

Age 23-35
Gender Female, 55.4%
Calm 56.2%
Happy 37%
Surprised 2.1%
Confused 1.6%
Sad 1.4%
Angry 1.1%
Disgusted 0.3%
Fear 0.2%

AWS Rekognition

Age 23-37
Gender Male, 78%
Calm 65.9%
Happy 12.7%
Surprised 8.3%
Confused 5.1%
Fear 3.7%
Sad 3.2%
Angry 0.6%
Disgusted 0.6%

AWS Rekognition

Age 46-64
Gender Female, 52.5%
Calm 86.5%
Sad 9%
Happy 2.2%
Angry 0.8%
Confused 0.6%
Surprised 0.5%
Disgusted 0.2%
Fear 0.1%

AWS Rekognition

Age 14-26
Gender Male, 68%
Happy 65.5%
Calm 28.4%
Sad 2%
Surprised 1.9%
Angry 0.9%
Confused 0.9%
Disgusted 0.3%
Fear 0.2%

AWS Rekognition

Age 48-66
Gender Female, 61.4%
Sad 54.5%
Calm 42.8%
Confused 1.4%
Fear 0.4%
Happy 0.3%
Surprised 0.3%
Angry 0.2%
Disgusted 0.1%

AWS Rekognition

Age 20-32
Gender Female, 63.7%
Calm 91.2%
Happy 3.3%
Surprised 2%
Angry 1.5%
Sad 0.9%
Confused 0.6%
Disgusted 0.4%
Fear 0.1%

AWS Rekognition

Age 22-34
Gender Male, 60.4%
Calm 91.8%
Happy 2.9%
Sad 2.5%
Surprised 1.4%
Confused 1.1%
Angry 0.1%
Disgusted 0.1%
Fear 0%

AWS Rekognition

Age 23-35
Gender Male, 64.3%
Sad 76.3%
Calm 22.3%
Confused 0.6%
Surprised 0.3%
Angry 0.2%
Happy 0.1%
Fear 0.1%
Disgusted 0.1%

AWS Rekognition

Age 48-66
Gender Female, 61.4%
Sad 46.3%
Calm 40.3%
Surprised 8%
Confused 2.5%
Angry 1.2%
Happy 1.1%
Disgusted 0.3%
Fear 0.3%

AWS Rekognition

Age 30-46
Gender Female, 54.8%
Sad 79.9%
Calm 18.5%
Confused 0.7%
Surprised 0.4%
Angry 0.2%
Fear 0.1%
Disgusted 0.1%
Happy 0.1%

AWS Rekognition

Age 47-65
Gender Male, 92.9%
Calm 87.3%
Sad 6.6%
Happy 1.7%
Surprised 1.6%
Confused 1.6%
Angry 0.6%
Fear 0.4%
Disgusted 0.2%
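Each AWS Rekognition block above reports a per-emotion confidence distribution for one detected face; the emotion listed first is simply the highest-scoring entry. A small sketch of that reduction, using the scores from the first face block above (the dictionary literal is transcribed data, not API output):

```python
# Hypothetical sketch: pick the dominant emotion from a
# Rekognition-style score table. Values mirror the first
# AWS Rekognition face block above.
emotions = {
    "Calm": 87.9, "Surprised": 5.1, "Sad": 2.9, "Happy": 1.4,
    "Fear": 1.1, "Angry": 0.8, "Confused": 0.5, "Disgusted": 0.3,
}

# The dominant emotion is the key with the maximum score.
dominant = max(emotions, key=emotions.get)
print(dominant)
```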

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very likely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Possible
Headwear Unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely
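Unlike the numeric scores above, Google Vision reports categorical likelihood buckets per face attribute. A sketch, assuming the standard ordering of the Vision API's likelihood enum, of mapping those buckets to ordinals so the face blocks above could be compared programmatically; the sample `face` dict mirrors the one block above that reports Joy as "Possible":

```python
# Sketch: map Google Vision likelihood buckets to an ordinal scale.
# Ordering follows the Vision API likelihood enum (UNKNOWN omitted).
LIKELIHOOD = {
    "Very unlikely": 1, "Unlikely": 2, "Possible": 3,
    "Likely": 4, "Very likely": 5,
}

# Transcribed from the one face block above with Joy "Possible".
face = {"Surprise": "Very unlikely", "Joy": "Possible", "Headwear": "Unlikely"}

# Flag a face when any attribute reaches at least "Possible".
notable = any(LIKELIHOOD[v] >= LIKELIHOOD["Possible"] for v in face.values())
print(notable)
```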

Feature analysis

Amazon

Person 99.3%
Poster 87.2%

Categories

Imagga

interior objects 95.6%
paintings art 3.1%

Text analysis

Amazon

FILM
АСГА NITRATE FILM
NITRATE
АСГА