Human Generated Data

Title

Untitled (nine children posed sitting in front of fireplace decorated for Christmas)

Date

1970

People

Artist: Martin Schweig, American 20th century

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.10132

Machine Generated Data

Tags

Amazon
created on 2022-01-28

Person 99.1%
Human 99.1%
Person 97.5%
Person 96.5%
People 95.6%
Person 95.5%
Person 95.3%
Person 93.5%
Person 91.8%
Person 90.1%
Stage 85%
Person 80.2%
Living Room 76.3%
Indoors 76.3%
Room 76.3%
Female 76.3%
Crowd 74.9%
Face 73.7%
Family 69%
Girl 68.1%
Person 68.1%
Curtain 65.2%
Furniture 64.6%
Couch 61.3%
Text 61.2%
Portrait 58.9%
Photography 58.9%
Photo 58.9%
Kid 56.3%
Child 56.3%
Clothing 56%
Apparel 56%
Monitor 55.1%
Electronics 55.1%
Screen 55.1%
Display 55.1%
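
The Amazon tags above match the shape of AWS Rekognition's DetectLabels output. A minimal sketch of how such a tag list could be reproduced with boto3, assuming configured AWS credentials; the filename photo.jpg is a hypothetical stand-in for this image:

import boto3

# Send the photograph to Rekognition and print each detected label
# with its confidence score, mirroring the tag list above.
# "photo.jpg" is a hypothetical local filename.
client = boto3.client("rekognition")

with open("photo.jpg", "rb") as f:
    response = client.detect_labels(Image={"Bytes": f.read()})

for label in response["Labels"]:
    print(f"{label['Name']} {label['Confidence']:.1f}%")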

Clarifai
created on 2023-10-27

people 99.9%
adult 98%
group 97.3%
woman 96.5%
wear 95.8%
man 94.9%
group together 93.6%
child 92.8%
many 92.5%
education 89.4%
room 88%
music 87.4%
leader 86.6%
furniture 86.3%
one 84.5%
sit 84.3%
administration 83.6%
audience 83.5%
indoors 80.5%
musician 79.7%
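
The Clarifai tags are concept scores from a general image-recognition model. A hedged sketch against Clarifai's v2 REST predict endpoint; the model ID, key placeholder, and image URL are all assumptions, not values from this record:

import requests

API_KEY = "YOUR_CLARIFAI_KEY"            # hypothetical credential
MODEL_ID = "general-image-recognition"   # assumed public general model ID

response = requests.post(
    f"https://api.clarifai.com/v2/models/{MODEL_ID}/outputs",
    headers={"Authorization": f"Key {API_KEY}"},
    json={"inputs": [{"data": {"image": {"url": "https://example.com/photo.jpg"}}}]},
)
# Concept values are 0-1; scale to percentages to match the list above.
for concept in response.json()["outputs"][0]["data"]["concepts"]:
    print(f"{concept['name']} {concept['value'] * 100:.1f}%")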

Imagga
created on 2022-01-28

classroom 30.2%
room 28.7%
blackboard 26.2%
building 19.2%
architecture 16.4%
freight car 16.1%
people 14.5%
cinema 13.5%
history 13.4%
old 13.2%
man 12.8%
theater 12.8%
interior 12.4%
car 12.1%
structure 12.1%
travel 11.9%
historical 11.3%
art 10.8%
black 10.8%
vintage 10.7%
tourism 10.7%
water 10.7%
wheeled vehicle 10.1%
house 10.1%
landmark 9.9%
scene 9.5%
city 9.1%
business 9.1%
urban 8.7%
shop 8.7%
window 8.6%
culture 8.5%
monument 8.4%
famous 8.4%
screen 8.3%
inside 8.3%
historic 8.2%
indoor 8.2%
tourist 8.1%
chair 8.1%
hall 7.9%
office 7.9%
male 7.8%
education 7.8%
antique 7.8%
altar 7.7%
church 7.4%
light 7.3%
teacher 7.2%
home 7.2%
religion 7.2%
school 7.1%
women 7.1%
person 7.1%
indoors 7%
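
The Imagga tags correspond to its v2 /tags endpoint, which scores tags on a 0-100 scale. A sketch assuming a hypothetical API key/secret pair (Imagga uses HTTP Basic auth) and a publicly reachable image URL:

import requests

# Hypothetical credentials and image URL.
AUTH = ("YOUR_API_KEY", "YOUR_API_SECRET")

response = requests.get(
    "https://api.imagga.com/v2/tags",
    params={"image_url": "https://example.com/photo.jpg"},
    auth=AUTH,
)
for tag in response.json()["result"]["tags"]:
    print(f"{tag['tag']['en']} {tag['confidence']:.1f}%")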

Google
created on 2022-01-28

Microsoft
created on 2022-01-28

person 95%
text 94.9%
indoor 89.5%
dance 80%
picture frame 7.6%
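
The Microsoft tags resemble the Tags feature of Azure Computer Vision's Analyze Image operation. A sketch against the v3.2 REST endpoint; the resource endpoint, key, and image URL are placeholders:

import requests

ENDPOINT = "https://YOUR_RESOURCE.cognitiveservices.azure.com"  # hypothetical
KEY = "YOUR_SUBSCRIPTION_KEY"                                   # hypothetical

response = requests.post(
    f"{ENDPOINT}/vision/v3.2/analyze",
    params={"visualFeatures": "Tags"},
    headers={"Ocp-Apim-Subscription-Key": KEY},
    json={"url": "https://example.com/photo.jpg"},
)
# Azure reports confidence on a 0-1 scale; scale to match the list above.
for tag in response.json()["tags"]:
    print(f"{tag['name']} {tag['confidence'] * 100:.1f}%")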

Color Analysis

Face analysis

AWS Rekognition

Age 34-42
Gender Female, 52.2%
Sad 77.6%
Calm 10.7%
Confused 3.7%
Disgusted 2.4%
Angry 2.1%
Fear 1.2%
Surprised 1.2%
Happy 1.1%

AWS Rekognition

Age 26-36
Gender Male, 99.9%
Sad 66.8%
Confused 21.4%
Calm 3.5%
Angry 3.3%
Disgusted 2.2%
Happy 1%
Surprised 0.9%
Fear 0.7%

AWS Rekognition

Age 24-34
Gender Female, 52.7%
Calm 48.7%
Sad 28.2%
Happy 9.5%
Confused 5%
Fear 4.9%
Surprised 2.3%
Disgusted 0.7%
Angry 0.7%

AWS Rekognition

Age 48-56
Gender Male, 97.6%
Happy 67.2%
Confused 10.4%
Sad 7.4%
Disgusted 4.3%
Calm 3.6%
Angry 3%
Surprised 2.9%
Fear 1.2%

AWS Rekognition

Age 23-31
Gender Male, 99.6%
Calm 35.3%
Sad 28.2%
Fear 12%
Angry 8%
Confused 7.9%
Disgusted 5.7%
Happy 1.4%
Surprised 1.4%

AWS Rekognition

Age 54-64
Gender Male, 99.8%
Sad 84.2%
Happy 7.3%
Calm 4.8%
Disgusted 1.3%
Confused 1.2%
Angry 0.5%
Fear 0.5%
Surprised 0.4%

AWS Rekognition

Age 40-48
Gender Male, 99.9%
Sad 79.6%
Calm 14.8%
Disgusted 2.2%
Confused 2.1%
Fear 0.6%
Angry 0.3%
Happy 0.2%
Surprised 0.2%

AWS Rekognition

Age 29-39
Gender Male, 97.7%
Calm 97.9%
Sad 0.8%
Fear 0.4%
Surprised 0.3%
Confused 0.3%
Angry 0.1%
Happy 0.1%
Disgusted 0.1%
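
Each block above is one face from Rekognition's DetectFaces API with full attributes requested. A sketch of how the age range, gender, and emotion scores could be read out; photo.jpg is again a hypothetical filename:

import boto3

client = boto3.client("rekognition")

with open("photo.jpg", "rb") as f:
    response = client.detect_faces(
        Image={"Bytes": f.read()},
        Attributes=["ALL"],  # request age, gender, and emotion estimates
    )

for face in response["FaceDetails"]:
    age = face["AgeRange"]
    gender = face["Gender"]
    print(f"Age {age['Low']}-{age['High']}")
    print(f"Gender {gender['Value']}, {gender['Confidence']:.1f}%")
    # Emotions arrive unsorted; order by confidence to match the blocks above.
    for emotion in sorted(face["Emotions"], key=lambda e: -e["Confidence"]):
        print(f"{emotion['Type'].title()} {emotion['Confidence']:.1f}%")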

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Unlikely
Blurred Possible

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely
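
The Google Vision blocks report likelihood buckets (VERY_UNLIKELY through VERY_LIKELY) rather than percentages. A sketch using the google-cloud-vision client library, with the same hypothetical filename:

from google.cloud import vision

client = vision.ImageAnnotatorClient()

with open("photo.jpg", "rb") as f:
    image = vision.Image(content=f.read())

response = client.face_detection(image=image)
for face in response.face_annotations:
    # Each attribute is a Likelihood enum; .name yields e.g. VERY_UNLIKELY.
    print("Surprise", vision.Likelihood(face.surprise_likelihood).name)
    print("Anger", vision.Likelihood(face.anger_likelihood).name)
    print("Sorrow", vision.Likelihood(face.sorrow_likelihood).name)
    print("Joy", vision.Likelihood(face.joy_likelihood).name)
    print("Headwear", vision.Likelihood(face.headwear_likelihood).name)
    print("Blurred", vision.Likelihood(face.blurred_likelihood).name)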

Feature analysis

Amazon

Person
Person 99.1%
Person 97.5%
Person 96.5%
Person 95.5%
Person 95.3%
Person 93.5%
Person 91.8%
Person 90.1%
Person 80.2%
Person 68.1%
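
The per-person percentages above correspond to the bounding-box Instances that Rekognition attaches to its Person label in DetectLabels output. A sketch, reusing the hypothetical filename:

import boto3

client = boto3.client("rekognition")

with open("photo.jpg", "rb") as f:
    response = client.detect_labels(Image={"Bytes": f.read()})

# Each instance is one detected person with its own bounding box and score.
for label in response["Labels"]:
    if label["Name"] == "Person":
        for instance in label["Instances"]:
            print(f"Person {instance['Confidence']:.1f}%")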

Categories

Text analysis

Amazon

8
KODAK
FILM
JULIE
KODAK SAFETY
SAFETY
Bette
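
Words like KODAK and SAFETY come from the film rebate, and the names from handwritten annotations; this is typical Rekognition DetectText output. A sketch with the hypothetical filename:

import boto3

client = boto3.client("rekognition")

with open("photo.jpg", "rb") as f:
    response = client.detect_text(Image={"Bytes": f.read()})

# DetectText returns LINE and WORD detections; print the individual words.
for detection in response["TextDetections"]:
    if detection["Type"] == "WORD":
        print(detection["DetectedText"])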

Google

KODAK SAFETY FILM FILM KODAK SAFETY
KODAK
SAFETY
FILM
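
The Google results follow the text_detection convention: the first annotation is the full concatenated text, the rest are individual words. A sketch with the hypothetical filename:

from google.cloud import vision

client = vision.ImageAnnotatorClient()

with open("photo.jpg", "rb") as f:
    image = vision.Image(content=f.read())

response = client.text_detection(image=image)
annotations = response.text_annotations
if annotations:
    print(annotations[0].description)  # full text block, e.g. "KODAK SAFETY FILM ..."
    for word in annotations[1:]:       # individual words
        print(word.description)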