Human Generated Data

Title

Untitled (nine children posed sitting in front of fireplace decorated for Christmas)

Date

1970

People

Artist: Martin Schweig, American 20th century

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.10129

Machine Generated Data

Tags (label, confidence in %)

Amazon
created on 2022-01-28

Person 98.8
Human 98.8
Person 97.8
Person 96.6
Person 96.3
Person 95.9
Person 95.7
Room 94.6
Indoors 94.6
Person 94.5
Person 93.8
Interior Design 91.8
People 91.3
Person 89.1
Living Room 75.7
Stage 67.4
Text 67.2
Furniture 66.1
Crowd 65.3
Female 65.2
Curtain 65
Face 63.1
Portrait 61.9
Photography 61.9
Photo 61.9
Girl 59.6
Family 55.5
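
The Amazon tags above are label-detection output with a confidence score per label. A minimal sketch of how such labels can be requested through the AWS Rekognition DetectLabels API (via boto3) follows; the file name and confidence threshold are illustrative assumptions, not values documented for this record.

```python
import boto3

# Assumed client and input image; the museum's actual tagging pipeline is not documented here.
rekognition = boto3.client("rekognition")

with open("photo.jpg", "rb") as f:  # placeholder file name
    response = rekognition.detect_labels(
        Image={"Bytes": f.read()},
        MinConfidence=55,  # assumed threshold; the listed tags bottom out near 55%
    )

# Each returned label carries a name and a confidence percentage, as listed above.
for label in response["Labels"]:
    print(f"{label['Name']} {label['Confidence']:.1f}")
```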

Clarifai
created on 2023-10-27

people 99.9
adult 97.5
group 97.4
wear 96.1
woman 96.1
child 95.9
group together 95.5
man 94.7
room 94.6
furniture 92.9
education 90.5
administration 88.6
sit 88.5
boy 87.2
family 85.5
many 85.4
leader 85.2
home 84
indoors 82.8
one 81.8

Imagga
created on 2022-01-28

freight car 100
car 100
wheeled vehicle 89.9
vehicle 56.6
conveyance 29.9
old 17.4
wall 17.1
window 16.7
building 15.9
grunge 15.3
black 13.8
architecture 13.3
art 13
city 12.5
pattern 12.3
texture 11.8
dirty 11.7
urban 11.3
people 11.1
business 10.9
shop 10.8
decoration 10.7
interior 10.6
grungy 10.4
antique 10.4
silhouette 9.9
vintage 9.9
retro 9.8
barbershop 9.7
room 9.6
space 9.3
dark 9.2
frame 9.1
decor 8.8
historical 8.5
travel 8.4
design 8.4
street 8.3
inside 8.3
paint 8.1
aged 8.1
copy space 8.1
history 8
light 8
night 8
scene 7.8
empty 7.7
house 7.5
style 7.4
transport 7.3
indoor 7.3
blackboard 7.3
border 7.2
transportation 7.2

Google
created on 2022-01-28

Microsoft
created on 2022-01-28

indoor 86.5
text 85.8
person 78.3
candle 63.9

Face analysis

AWS Rekognition

Age 27-37
Gender Male, 64.3%
Calm 35.9%
Sad 33.8%
Disgusted 12%
Surprised 8.6%
Confused 5.7%
Angry 1.7%
Happy 1.5%
Fear 0.9%

AWS Rekognition

Age 43-51
Gender Male, 98.9%
Calm 92.6%
Confused 1.8%
Angry 1.7%
Surprised 1.5%
Happy 0.9%
Sad 0.6%
Disgusted 0.6%
Fear 0.3%

AWS Rekognition

Age 38-46
Gender Male, 99.2%
Sad 75.6%
Calm 20.9%
Happy 1.5%
Confused 0.6%
Fear 0.6%
Angry 0.3%
Surprised 0.2%
Disgusted 0.2%

AWS Rekognition

Age 42-50
Gender Male, 95.5%
Sad 87.7%
Calm 5.2%
Confused 2%
Disgusted 1.8%
Happy 1.2%
Fear 0.7%
Angry 0.7%
Surprised 0.6%

AWS Rekognition

Age 40-48
Gender Male, 99.4%
Happy 54.3%
Sad 12.6%
Calm 8.3%
Disgusted 8.1%
Confused 6.1%
Angry 5.5%
Surprised 3.5%
Fear 1.6%

AWS Rekognition

Age 31-41
Gender Female, 96.2%
Sad 41.4%
Fear 19.1%
Calm 16.5%
Happy 10%
Surprised 8.5%
Confused 1.8%
Angry 1.6%
Disgusted 1.2%

AWS Rekognition

Age 41-49
Gender Male, 98.9%
Sad 28.6%
Calm 20.8%
Disgusted 19.8%
Happy 12.5%
Surprised 9.5%
Confused 4.5%
Angry 3.4%
Fear 1%
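
Each AWS Rekognition block above corresponds to one detected face, with an estimated age range, a gender guess, and emotion confidences that together approach 100%. A hedged sketch of how such per-face attributes can be obtained with the Rekognition DetectFaces API is shown below; the file name is a placeholder.

```python
import boto3

rekognition = boto3.client("rekognition")

with open("photo.jpg", "rb") as f:  # placeholder file name
    response = rekognition.detect_faces(
        Image={"Bytes": f.read()},
        Attributes=["ALL"],  # request age range, gender, and emotion estimates
    )

for face in response["FaceDetails"]:
    age = face["AgeRange"]      # e.g. {"Low": 27, "High": 37}
    gender = face["Gender"]     # e.g. {"Value": "Male", "Confidence": 64.3}
    emotions = sorted(face["Emotions"], key=lambda e: e["Confidence"], reverse=True)
    print(f"Age {age['Low']}-{age['High']}, Gender {gender['Value']} {gender['Confidence']:.1f}%")
    for emotion in emotions:
        print(f"  {emotion['Type'].title()} {emotion['Confidence']:.1f}%")
```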

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely
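
The Google Vision entries report categorical per-face likelihoods (Very unlikely through Very likely) rather than numeric confidences. A minimal sketch using the google-cloud-vision client's face_detection call follows; the file name and output formatting are assumptions.

```python
from google.cloud import vision

client = vision.ImageAnnotatorClient()

with open("photo.jpg", "rb") as f:  # placeholder file name
    image = vision.Image(content=f.read())

response = client.face_detection(image=image)

# Each face annotation exposes likelihood enums rather than percentages.
for face in response.face_annotations:
    print(
        "Surprise", vision.Likelihood(face.surprise_likelihood).name,
        "Anger", vision.Likelihood(face.anger_likelihood).name,
        "Sorrow", vision.Likelihood(face.sorrow_likelihood).name,
        "Joy", vision.Likelihood(face.joy_likelihood).name,
        "Headwear", vision.Likelihood(face.headwear_likelihood).name,
        "Blurred", vision.Likelihood(face.blurred_likelihood).name,
    )
```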

Feature analysis

Amazon

Person
Person 98.8%
Person 97.8%
Person 96.6%
Person 96.3%
Person 95.9%
Person 95.7%
Person 94.5%
Person 93.8%
Person 89.1%

Text analysis

Amazon

FILM
JULIE
n
ODAK
КОПАК
ODAK SARETY
SARETY
E
better
SAFETY
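
The Amazon lines above are OCR output, presumably from Rekognition's DetectText, and the fragments such as "ODAK", "SAFETY", and "FILM" likely come from the photograph's film-edge printing. A sketch of retrieving word-level detections this way is below; the file name is a placeholder.

```python
import boto3

rekognition = boto3.client("rekognition")

with open("photo.jpg", "rb") as f:  # placeholder file name
    response = rekognition.detect_text(Image={"Bytes": f.read()})

# WORD entries are individual tokens; LINE entries group them into longer runs.
for detection in response["TextDetections"]:
    if detection["Type"] == "WORD":
        print(detection["DetectedText"], round(detection["Confidence"], 1))
```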

Google

ODAK AFE ILM
ODAK
AFE
ILM