Human Generated Data

Title

Untitled (nine children posed sitting in front of fireplace decorated for Christmas)

Date

1970

People

Artist: Martin Schweig, American 20th century

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.10191

Machine Generated Data

Tags

Amazon
created on 2022-01-22

Person 99.3
Human 99.3
Person 99.3
Person 99.1
Person 99
Person 98.8
Person 98.7
Person 98.4
Person 97.5
Person 96.8
People 90.8
Person 86.6
Tie 83.4
Accessories 83.4
Accessory 83.4
Indoors 75.2
Room 75.2
Family 68.2
Living Room 64.2
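
The Rekognition labels above pair a tag name with a confidence score out of 100. Below is a minimal sketch of how such tags could be produced with the AWS Rekognition DetectLabels API via boto3; the file name, region, and thresholds are illustrative assumptions, not values taken from this record.

import boto3

rekognition = boto3.client("rekognition", region_name="us-east-1")

# Hypothetical local copy of the photograph.
with open("photo.jpg", "rb") as f:
    image_bytes = f.read()

response = rekognition.detect_labels(
    Image={"Bytes": image_bytes},
    MaxLabels=25,
    MinConfidence=60.0,
)

# Each label carries a name and a 0-100 confidence, matching rows such as
# "Person 99.3" and "Tie 83.4" above.
for label in response["Labels"]:
    print(f'{label["Name"]} {label["Confidence"]:.1f}')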

Clarifai
created on 2023-10-26

people 99.9
group 99.7
group together 98.1
many 97.4
adult 97
man 96.8
woman 94.7
child 94.6
several 91
five 88.4
wear 88.2
leader 87.9
recreation 87.6
family 86.8
boy 85.1
administration 84.9
music 82.6
four 82.5
furniture 81.8
adolescent 78.1
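
A rough sketch of a comparable call against Clarifai's v2 prediction REST API with the requests library follows. The endpoint path, model name ("general-image-recognition"), API key, and image URL are assumptions recalled from Clarifai's documentation, not details from this record; check the current API reference before relying on them.

import requests

CLARIFAI_API_KEY = "YOUR_API_KEY"                # hypothetical credential
IMAGE_URL = "https://example.org/photo.jpg"      # hypothetical image location

resp = requests.post(
    "https://api.clarifai.com/v2/models/general-image-recognition/outputs",
    headers={"Authorization": f"Key {CLARIFAI_API_KEY}"},
    json={"inputs": [{"data": {"image": {"url": IMAGE_URL}}}]},
    timeout=30,
)
resp.raise_for_status()

# Concepts return a name and a 0-1 value; scaling by 100 gives scores
# comparable to "people 99.9" and "group 99.7" above.
for concept in resp.json()["outputs"][0]["data"]["concepts"]:
    print(f'{concept["name"]} {concept["value"] * 100:.1f}')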

Imagga
created on 2022-01-22

man 20.2
people 19.5
person 17.8
musical instrument 16.7
city 15.8
brass 14.9
black 13.9
room 13.2
urban 13.1
wind instrument 13
classroom 12.2
building 12.1
dress 11.7
male 11.4
decoration 11.3
stringed instrument 11.2
portrait 11
travel 10.5
group 10.5
shop 10.4
adult 10.4
business 10.3
grunge 10.2
silhouette 9.9
old 9.7
crowd 9.6
transportation 9
interior 8.8
holiday 8.6
architecture 8.6
men 8.6
vintage 8.3
case 8.3
bowed stringed instrument 8.2
art 8.2
lady 8.1
team 8.1
sexy 8
station 8
celebration 8
violin 7.9
sport 7.9
motion 7.7
elegant 7.7
train 7.7
shoe shop 7.5
fashion 7.5
journey 7.5
happy 7.5
gymnasium 7.4
window 7.4
speed 7.3
suit 7.2
home 7.2
working 7.1
businessman 7.1
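
The Imagga scores above come from its tagging endpoint, which uses HTTP basic authentication. The sketch below is a hedged reconstruction: the URL, parameter name, and response layout follow Imagga's v2 API as best recalled here and should be verified against its documentation.

import requests

IMAGGA_API_KEY = "YOUR_KEY"                      # hypothetical credentials
IMAGGA_API_SECRET = "YOUR_SECRET"
IMAGE_URL = "https://example.org/photo.jpg"      # hypothetical image location

resp = requests.get(
    "https://api.imagga.com/v2/tags",
    params={"image_url": IMAGE_URL},
    auth=(IMAGGA_API_KEY, IMAGGA_API_SECRET),
    timeout=30,
)
resp.raise_for_status()

# Each tag carries a 0-100 confidence and a localized name.
for tag in resp.json()["result"]["tags"]:
    print(f'{tag["tag"]["en"]} {tag["confidence"]:.1f}')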

Google
created on 2022-01-22

Microsoft
created on 2022-01-22

person 90.2
clothing 88.3
posing 65.8
human face 59.7
woman 58.7
smile 52.3
altar 25
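
A minimal sketch, assuming an Azure Computer Vision resource, of the Analyze Image call that returns tags with confidences like the rows above. The endpoint host, key, API version, and image URL are placeholders, and the request shape should be checked against the current Azure documentation.

import requests

AZURE_ENDPOINT = "https://my-resource.cognitiveservices.azure.com"  # hypothetical
AZURE_KEY = "YOUR_KEY"                                              # hypothetical
IMAGE_URL = "https://example.org/photo.jpg"                         # hypothetical

resp = requests.post(
    f"{AZURE_ENDPOINT}/vision/v3.2/analyze",
    params={"visualFeatures": "Tags"},
    headers={"Ocp-Apim-Subscription-Key": AZURE_KEY},
    json={"url": IMAGE_URL},
    timeout=30,
)
resp.raise_for_status()

# Tag confidences are reported on a 0-1 scale; scaling by 100 matches rows
# such as "person 90.2" and "clothing 88.3" above.
for tag in resp.json()["tags"]:
    print(f'{tag["name"]} {tag["confidence"] * 100:.1f}')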

Color Analysis

Face analysis

AWS Rekognition

Age 31-41
Gender Female, 100%
Happy 98.8%
Calm 0.7%
Surprised 0.1%
Sad 0.1%
Disgusted 0.1%
Angry 0.1%
Confused 0.1%
Fear 0%

AWS Rekognition

Age 25-35
Gender Female, 98.6%
Happy 99.7%
Calm 0.1%
Surprised 0.1%
Angry 0.1%
Confused 0%
Sad 0%
Fear 0%
Disgusted 0%

AWS Rekognition

Age 6-12
Gender Male, 100%
Happy 99.3%
Calm 0.3%
Surprised 0.1%
Confused 0.1%
Angry 0%
Sad 0%
Fear 0%
Disgusted 0%

AWS Rekognition

Age 6-16
Gender Female, 99%
Calm 99.5%
Surprised 0.1%
Angry 0.1%
Sad 0.1%
Fear 0.1%
Confused 0.1%
Happy 0%
Disgusted 0%

AWS Rekognition

Age 0-6
Gender Female, 98.8%
Calm 97.8%
Angry 0.5%
Confused 0.5%
Surprised 0.4%
Sad 0.3%
Disgusted 0.2%
Fear 0.2%
Happy 0.1%

AWS Rekognition

Age 21-29
Gender Female, 99.9%
Happy 99.7%
Calm 0.1%
Surprised 0.1%
Angry 0%
Disgusted 0%
Confused 0%
Sad 0%
Fear 0%

AWS Rekognition

Age 2-8
Gender Male, 97.7%
Surprised 60.6%
Fear 23%
Calm 11.9%
Confused 2.4%
Angry 0.7%
Disgusted 0.6%
Sad 0.5%
Happy 0.4%

AWS Rekognition

Age 7-17
Gender Male, 88.5%
Happy 99.4%
Calm 0.3%
Confused 0.1%
Surprised 0.1%
Angry 0.1%
Sad 0%
Disgusted 0%
Fear 0%

AWS Rekognition

Age 2-8
Gender Female, 99.6%
Happy 99.9%
Surprised 0%
Angry 0%
Calm 0%
Disgusted 0%
Sad 0%
Fear 0%
Confused 0%
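
The per-face age ranges, gender calls, and emotion percentages above are the kind of output returned by the AWS Rekognition DetectFaces API. A minimal boto3 sketch follows; the file name and region are illustrative assumptions.

import boto3

rekognition = boto3.client("rekognition", region_name="us-east-1")

# Hypothetical local copy of the photograph.
with open("photo.jpg", "rb") as f:
    image_bytes = f.read()

# Attributes=["ALL"] requests age range, gender, emotions, and other details.
response = rekognition.detect_faces(
    Image={"Bytes": image_bytes},
    Attributes=["ALL"],
)

for face in response["FaceDetails"]:
    age = face["AgeRange"]
    gender = face["Gender"]
    print(f'Age {age["Low"]}-{age["High"]}')
    print(f'Gender {gender["Value"]}, {gender["Confidence"]:.1f}%')
    # Emotions are scored 0-100 and sorted here from most to least likely.
    for emotion in sorted(face["Emotions"], key=lambda e: -e["Confidence"]):
        print(f'{emotion["Type"].title()} {emotion["Confidence"]:.1f}%')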

Microsoft Cognitive Services

Age 2
Gender Female

Microsoft Cognitive Services

Age 5
Gender Male
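
The Microsoft age and gender estimates above match what the Azure Face API's detect operation used to return when asked for face attributes. The sketch below is a hedged reconstruction: the endpoint, parameters, and attribute availability are assumptions (Microsoft has since restricted age and gender estimation), so verify against current documentation.

import requests

AZURE_ENDPOINT = "https://my-resource.cognitiveservices.azure.com"  # hypothetical
AZURE_KEY = "YOUR_KEY"                                              # hypothetical
IMAGE_URL = "https://example.org/photo.jpg"                         # hypothetical

resp = requests.post(
    f"{AZURE_ENDPOINT}/face/v1.0/detect",
    params={"returnFaceAttributes": "age,gender"},
    headers={"Ocp-Apim-Subscription-Key": AZURE_KEY},
    json={"url": IMAGE_URL},
    timeout=30,
)
resp.raise_for_status()

# Each detected face reports an estimated age and a gender string.
for face in resp.json():
    attrs = face["faceAttributes"]
    print(f'Age {attrs["age"]:.0f}')
    print(f'Gender {attrs["gender"].title()}')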

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very likely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very likely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very likely
Headwear Possible
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very likely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very likely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very likely
Headwear Very unlikely
Blurred Very unlikely
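
The Google Vision ratings above (Joy, Sorrow, Anger, Surprise, Headwear, Blurred) are likelihood values from its face detection feature. A minimal sketch with the google-cloud-vision client follows; the image path is a placeholder and credentials are assumed to be configured in the environment.

from google.cloud import vision

client = vision.ImageAnnotatorClient()

# Hypothetical local copy of the photograph.
with open("photo.jpg", "rb") as f:
    image = vision.Image(content=f.read())

response = client.face_detection(image=image)

for face in response.face_annotations:
    # Likelihood values are enum members such as VERY_UNLIKELY or VERY_LIKELY.
    print("Surprise", face.surprise_likelihood.name)
    print("Anger", face.anger_likelihood.name)
    print("Sorrow", face.sorrow_likelihood.name)
    print("Joy", face.joy_likelihood.name)
    print("Headwear", face.headwear_likelihood.name)
    print("Blurred", face.blurred_likelihood.name)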

Feature analysis

Amazon

Person 99.3%
Tie 83.4%

Categories

Text analysis

Amazon

PROOF
Exttr
JULiE
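
The Amazon text results above ("PROOF", "Exttr", "JULiE") are OCR detections of the kind returned by the AWS Rekognition DetectText API. A minimal boto3 sketch follows; the file name and region are illustrative assumptions.

import boto3

rekognition = boto3.client("rekognition", region_name="us-east-1")

# Hypothetical local copy of the photograph.
with open("photo.jpg", "rb") as f:
    image_bytes = f.read()

response = rekognition.detect_text(Image={"Bytes": image_bytes})

# WORD detections give the individual strings; LINE entries aggregate them.
for detection in response["TextDetections"]:
    if detection["Type"] == "WORD":
        print(detection["DetectedText"], f'{detection["Confidence"]:.1f}')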

Google

JULIE Betts PROOF
JULIE
Betts
PROOF
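
The Google rows above pair a full transcription ("JULIE Betts PROOF") with the individual words, which is how the google-cloud-vision text detection feature reports results. A minimal sketch follows; the image path is a placeholder and credentials are assumed to be configured in the environment.

from google.cloud import vision

client = vision.ImageAnnotatorClient()

# Hypothetical local copy of the photograph.
with open("photo.jpg", "rb") as f:
    image = vision.Image(content=f.read())

response = client.text_detection(image=image)

annotations = response.text_annotations
if annotations:
    # The first annotation holds the full detected text; the rest are words.
    print(annotations[0].description)
    for word in annotations[1:]:
        print(word.description)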