Human Generated Data

Title

Untitled (four women standing behind buffet table at fancy event)

Date

1948

People

Artist: Martin Schweig, American 20th century

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.9262

Machine Generated Data

Tags

Amazon
created on 2022-01-23

Human 99.7
Person 99.7
Person 99.6
Person 98.7
Person 97.1
People 93.8
Tablecloth 90.9
Crowd 85.9
Text 84.7
Face 78.5
Plant 75.2
Photo 69.8
Photography 69.8
Portrait 68
Blossom 65.8
Flower 65.8
Family 63.1
Sideboard 56.6
Furniture 56.6
Person 44
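
A minimal sketch of how label/confidence pairs like the Amazon list above are typically produced with boto3's Rekognition DetectLabels call; the file path and thresholds below are placeholders, not values taken from this record:

```python
import boto3

IMAGE_PATH = "photo.jpg"  # hypothetical local copy of the photograph

client = boto3.client("rekognition")

with open(IMAGE_PATH, "rb") as f:
    response = client.detect_labels(
        Image={"Bytes": f.read()},
        MaxLabels=20,       # the list above holds 20 tags
        MinConfidence=40,   # lowest confidence shown above is 44
    )

# Emit "Label confidence" pairs, e.g. "Person 99.7"
for label in response["Labels"]:
    print(f'{label["Name"]} {label["Confidence"]:.1f}')
```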

Clarifai
created on 2023-10-26

people 99.9
group 99.4
adult 98.8
woman 97.1
furniture 96.2
many 96.1
man 95.3
leader 92.7
wear 92.3
illustration 91.8
administration 88.2
several 88.2
home 87.2
war 86.3
family 86
print 83.9
room 83.4
commerce 82.5
group together 82.1
music 81.9
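
Clarifai tags of this kind typically come from its v2 predict endpoint. The sketch below is an assumption-laden REST call via requests: the model ID, personal access token, image URL, and payload shape follow Clarifai's public v2 API docs, not anything recorded here:

```python
import requests

PAT = "YOUR_CLARIFAI_PAT"                    # placeholder credential
MODEL_ID = "general-image-recognition"       # assumed public general model
IMAGE_URL = "https://example.com/photo.jpg"  # placeholder image

response = requests.post(
    f"https://api.clarifai.com/v2/models/{MODEL_ID}/outputs",
    headers={"Authorization": f"Key {PAT}"},
    json={"inputs": [{"data": {"image": {"url": IMAGE_URL}}}]},
)
response.raise_for_status()

# Concepts arrive as name/value pairs; scale values to percentages
for concept in response.json()["outputs"][0]["data"]["concepts"]:
    print(concept["name"], round(concept["value"] * 100, 1))
```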

Imagga
created on 2022-01-23

blackboard 25.4
grunge 17.9
art 17.2
structure 16.6
old 15.3
design 14.6
vintage 14.1
menorah 13.3
silhouette 13.2
altar 12.3
retro 12.3
black 12
frame 11.9
pattern 11.6
men 11.2
people 11.1
building 11.1
man 10.8
decoration 10.8
candelabrum 10.7
window 10.1
house 10.1
style 9.6
architecture 9.4
city 9.1
drawing 9.1
history 8.9
graphic 8.7
light 8.7
antique 8.6
male 8.5
texture 8.3
life 8.3
film 8.2
paint 8.1
candlestick 8
interior 8
postage 7.9
percussion instrument 7.8
stamp 7.7
mail 7.7
dirt 7.6
shop 7.6
symbol 7.4
marimba 7.4
business 7.3
border 7.2
dirty 7.2
home 7.2
sky 7
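
The Imagga list has the same tag/confidence shape and can be reproduced with its documented /v2/tags endpoint over HTTP basic auth; the key, secret, and image URL below are placeholders:

```python
import requests

API_KEY = "YOUR_IMAGGA_KEY"                  # placeholder
API_SECRET = "YOUR_IMAGGA_SECRET"            # placeholder
IMAGE_URL = "https://example.com/photo.jpg"  # placeholder

response = requests.get(
    "https://api.imagga.com/v2/tags",
    params={"image_url": IMAGE_URL},
    auth=(API_KEY, API_SECRET),
)
response.raise_for_status()

# Each entry looks like {"confidence": 25.4, "tag": {"en": "blackboard"}}
for item in response.json()["result"]["tags"]:
    print(item["tag"]["en"], round(item["confidence"], 1))
```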

Google
created on 2022-01-23

Microsoft
created on 2022-01-23

text 98.7
clothing 93.8
person 93.7
woman 82.3
table 78.3
wedding 68.1
man 57.7
store 40.1
posing 36.4
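
The Microsoft tags are consistent with Azure Computer Vision's image-tagging operation. A minimal sketch with the azure-cognitiveservices-vision-computervision SDK, where the endpoint, key, and file path are placeholders:

```python
from azure.cognitiveservices.vision.computervision import ComputerVisionClient
from msrest.authentication import CognitiveServicesCredentials

ENDPOINT = "https://YOUR-RESOURCE.cognitiveservices.azure.com/"  # placeholder
KEY = "YOUR_AZURE_KEY"                                           # placeholder

client = ComputerVisionClient(ENDPOINT, CognitiveServicesCredentials(KEY))

with open("photo.jpg", "rb") as f:  # hypothetical local image
    result = client.tag_image_in_stream(f)

# The SDK returns 0-1 confidences; scale to percentages like "text 98.7"
for tag in result.tags:
    print(tag.name, round(tag.confidence * 100, 1))
```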

Face analysis

Amazon

AWS Rekognition

Age 30-40
Gender Male, 86.7%
Happy 84.4%
Sad 9.2%
Surprised 1.9%
Calm 1.7%
Confused 1.3%
Disgusted 0.6%
Angry 0.6%
Fear 0.3%

AWS Rekognition

Age 27-37
Gender Female, 54.5%
Calm 64.8%
Sad 31.7%
Confused 1%
Happy 0.9%
Disgusted 0.5%
Angry 0.4%
Fear 0.4%
Surprised 0.3%

AWS Rekognition

Age 25-35
Gender Male, 99.7%
Confused 33.6%
Calm 23.3%
Surprised 14%
Disgusted 10.6%
Sad 10%
Happy 4%
Angry 3.2%
Fear 1.3%
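
The three AWS Rekognition blocks above have the shape of a DetectFaces response with all attributes enabled: an estimated age range, a gender guess with confidence, and a ranked emotion list per detected face. A boto3 sketch (the file path is a placeholder):

```python
import boto3

client = boto3.client("rekognition")

with open("photo.jpg", "rb") as f:  # hypothetical local image
    response = client.detect_faces(
        Image={"Bytes": f.read()},
        Attributes=["ALL"],  # needed for age, gender, and emotions
    )

for face in response["FaceDetails"]:
    age = face["AgeRange"]
    print(f'Age {age["Low"]}-{age["High"]}')
    gender = face["Gender"]
    print(f'Gender {gender["Value"]}, {gender["Confidence"]:.1f}%')
    # Emotions are not guaranteed to be sorted; rank them as above
    for emotion in sorted(face["Emotions"], key=lambda e: -e["Confidence"]):
        print(f'{emotion["Type"].title()} {emotion["Confidence"]:.1f}%')
```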

Google

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely
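
Each Google Vision block above lists the six per-face likelihood fields returned by the face_detection annotator. A sketch with the google-cloud-vision client (the file path is a placeholder):

```python
from google.cloud import vision

client = vision.ImageAnnotatorClient()

with open("photo.jpg", "rb") as f:  # hypothetical local image
    image = vision.Image(content=f.read())

response = client.face_detection(image=image)

def pretty(likelihood):
    # Render enum names such as VERY_UNLIKELY as "Very unlikely"
    return vision.Likelihood(likelihood).name.replace("_", " ").capitalize()

for face in response.face_annotations:
    print("Surprise", pretty(face.surprise_likelihood))
    print("Anger", pretty(face.anger_likelihood))
    print("Sorrow", pretty(face.sorrow_likelihood))
    print("Joy", pretty(face.joy_likelihood))
    print("Headwear", pretty(face.headwear_likelihood))
    print("Blurred", pretty(face.blurred_likelihood))
```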

Feature analysis

Amazon

Person 99.7%

Text analysis

Amazon

EIEW
мигсо SALE EIEW
мигсо
SALE
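
The Amazon text results above (full detected lines followed by individual words) match Rekognition's DetectText response, which labels each detection LINE or WORD. A boto3 sketch (the file path is a placeholder):

```python
import boto3

client = boto3.client("rekognition")

with open("photo.jpg", "rb") as f:  # hypothetical local image
    response = client.detect_text(Image={"Bytes": f.read()})

# Print LINE detections first, then WORDs, mirroring the record above
for kind in ("LINE", "WORD"):
    for detection in response["TextDetections"]:
        if detection["Type"] == kind:
            print(detection["DetectedText"])
```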

Google

t 3 | S Ə
t
|
S
3
Ə