Human Generated Data

Title

Untitled (nine children posed sitting in front of fireplace decorated with stockings)

Date

1968

People

Artist: Martin Schweig, American 20th century

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.10104

Machine Generated Data

Tags

Amazon
created on 2022-01-28

Person 99.3
Human 99.3
Person 99
Person 98.9
Person 97.1
Person 95.9
Person 95.8
Person 95.4
Interior Design 94.4
Indoors 94.4
Person 94.3
Room 90.7
Living Room 86.8
Person 85.7
People 85.1
Person 69.3
Female 67.3
Curtain 67.1
Furniture 63.8
Clothing 63.4
Apparel 63.4
Text 63.1
Face 62.4
Crowd 61.5
Shorts 57.5
Couch 57.3
Photo 56.7
Photography 56.7
Stage 56.1
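
The Amazon tags above pair each label with a confidence score on a 0-100 scale, which is the shape of output returned by AWS Rekognition's label detection. A minimal sketch of how such tags could be generated with boto3 follows; the file name and thresholds are illustrative assumptions, not values recorded here.

import boto3

def detect_labels(image_path, min_confidence=55.0):
    # Label detection: each result carries a Name and a Confidence (0-100),
    # matching the "Person 99.3", "Room 90.7" style of the list above.
    client = boto3.client("rekognition")
    with open(image_path, "rb") as f:
        response = client.detect_labels(
            Image={"Bytes": f.read()},
            MaxLabels=50,
            MinConfidence=min_confidence,
        )
    return [(label["Name"], label["Confidence"]) for label in response["Labels"]]

for name, confidence in detect_labels("photo.jpg"):  # hypothetical file name
    print(f"{name} {confidence:.1f}")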

Imagga
created on 2022-01-28

freight car 100
car 91.2
wheeled vehicle 72.7
vehicle 45.8
blackboard 39.6
conveyance 24.2
television 20.3
grunge 17
frame 16.6
monitor 16.3
screen 15.8
black 15.6
old 15.3
vintage 14.9
texture 14.6
digital 14.6
design 14.1
computer 13.2
antique 13
business 12.7
film 12.7
border 12.6
dirty 12.6
retro 12.3
entertainment 12
chalkboard 11.8
art 11.7
movie 11.6
pattern 11.6
cinema 11.3
blank 11.1
room 10.5
wall 10.3
graphic 10.2
finance 10.1
man 10.1
people 10
board 9.9
display 9.9
technology 9.6
building 9.6
damaged 9.5
education 9.5
grungy 9.5
classroom 9.4
space 9.3
theater 9.3
rough 9.1
paint 9
aged 9
structure 9
class 8.7
paper 8.6
empty 8.6
architecture 8.6
decoration 8.4
back 8.3
school 8.1
home 8
interior 7.9
billboard 7.9
scratches 7.9
text 7.8
frames 7.8
noise 7.8
slide 7.8
scratch 7.8
flat 7.7
collage 7.7
edge 7.7
window 7.4
equipment 7.4
silhouette 7.4
material 7.1
male 7.1
information 7.1
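
The Imagga tags appear to come from Imagga's auto-tagging REST API, which also reports per-tag confidence. A rough sketch is below; the endpoint path, response fields, and credential handling are assumptions based on the public v2 API and should be checked against Imagga's current documentation.

import requests

def imagga_tags(image_url, api_key, api_secret):
    # Assumed v2 tagging endpoint; authentication is HTTP Basic with the
    # account's key/secret pair. Returns (tag, confidence) pairs.
    response = requests.get(
        "https://api.imagga.com/v2/tags",
        params={"image_url": image_url},
        auth=(api_key, api_secret),
    )
    response.raise_for_status()
    return [(t["tag"]["en"], t["confidence"])
            for t in response.json()["result"]["tags"]]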

Google
created on 2022-01-28

Black 89.5
Picture frame 86.6
Rectangle 85.2
Font 83.6
Art 79.9
Adaptation 79.3
Curtain 75.5
Event 73.7
Monochrome photography 70.7
Monochrome 68.1
Visual arts 67.6
Room 65.6
Photo caption 63.3
History 62
Suit 59.9
Illustration 59.3
Wood 57.4
Painting 56.2
Interior design 54.6
Grass 53.2
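
The Google tags match the label-detection annotations of the Cloud Vision API, whose scores are returned on a 0-1 scale (the 0-100 figures above look like the same values scaled). A minimal sketch with the google-cloud-vision client library, assuming a local image file:

from google.cloud import vision

def google_labels(image_path):
    # Label detection; score is 0-1, converted here to a percentage to match
    # the formatting of the list above.
    client = vision.ImageAnnotatorClient()
    with open(image_path, "rb") as f:
        image = vision.Image(content=f.read())
    response = client.label_detection(image=image)
    return [(label.description, label.score * 100)
            for label in response.label_annotations]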

Microsoft
created on 2022-01-28

text 97.2
person 86.7
drawing 81.8
clothing 51.6
room 50.1

Face analysis

AWS Rekognition

Age 36-44
Gender Female, 66.1%
Sad 73.2%
Disgusted 14.4%
Confused 4%
Happy 2.4%
Angry 2.2%
Surprised 1.6%
Calm 1.4%
Fear 0.9%

AWS Rekognition

Age 38-46
Gender Male, 99.5%
Calm 97.2%
Sad 1.3%
Confused 0.6%
Angry 0.2%
Happy 0.2%
Disgusted 0.2%
Surprised 0.2%
Fear 0.1%

AWS Rekognition

Age 45-53
Gender Male, 99.5%
Calm 73.6%
Sad 17.2%
Confused 3.5%
Happy 2.1%
Surprised 1.5%
Disgusted 1.4%
Angry 0.4%
Fear 0.3%

AWS Rekognition

Age 39-47
Gender Male, 98.3%
Sad 29.8%
Calm 25.3%
Happy 20.3%
Disgusted 10.3%
Angry 5%
Surprised 4.9%
Fear 2.3%
Confused 2.1%

AWS Rekognition

Age 50-58
Gender Male, 98.6%
Sad 53%
Disgusted 29.2%
Calm 9.7%
Confused 4.6%
Fear 1.6%
Surprised 0.7%
Angry 0.6%
Happy 0.6%

AWS Rekognition

Age 21-29
Gender Male, 99.8%
Sad 58.7%
Calm 18%
Confused 11%
Angry 4.7%
Disgusted 2.6%
Surprised 1.9%
Happy 1.9%
Fear 1.2%

AWS Rekognition

Age 27-37
Gender Male, 99.2%
Sad 74.4%
Angry 15.7%
Calm 4.4%
Disgusted 1.7%
Fear 1.2%
Confused 0.9%
Surprised 0.9%
Happy 0.8%

AWS Rekognition

Age 33-41
Gender Female, 57.3%
Fear 18.6%
Disgusted 15.7%
Sad 15.3%
Happy 14.6%
Confused 14.4%
Surprised 10.4%
Angry 7.4%
Calm 3.7%
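
The age ranges, gender estimates, and ranked emotion percentages in the blocks above are consistent with AWS Rekognition's face detection when all facial attributes are requested. A sketch of such a call, with an assumed local file, might look like this:

import boto3

def analyze_faces(image_path):
    # Attributes=["ALL"] returns AgeRange, Gender, and Emotions with
    # confidences, the fields reported for each face above.
    client = boto3.client("rekognition")
    with open(image_path, "rb") as f:
        response = client.detect_faces(Image={"Bytes": f.read()},
                                       Attributes=["ALL"])
    for face in response["FaceDetails"]:
        age, gender = face["AgeRange"], face["Gender"]
        print(f"Age {age['Low']}-{age['High']}")
        print(f"Gender {gender['Value']}, {gender['Confidence']:.1f}%")
        for emotion in sorted(face["Emotions"],
                              key=lambda e: e["Confidence"], reverse=True):
            print(f"{emotion['Type'].capitalize()} {emotion['Confidence']:.1f}%")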

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Likely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely
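
The Google Vision rows report likelihood buckets (Very unlikely, Unlikely, Likely) rather than numeric scores, which matches the face-detection annotations of the Cloud Vision API. A minimal sketch, again assuming the google-cloud-vision client library:

from google.cloud import vision

def face_likelihoods(image_path):
    # Each face annotation exposes enum likelihood fields
    # (VERY_UNLIKELY ... VERY_LIKELY) for the attributes listed above.
    client = vision.ImageAnnotatorClient()
    with open(image_path, "rb") as f:
        image = vision.Image(content=f.read())
    response = client.face_detection(image=image)
    for face in response.face_annotations:
        for attr in ("surprise", "anger", "sorrow", "joy",
                     "headwear", "blurred"):
            value = getattr(face, f"{attr}_likelihood")
            print(attr.capitalize(), vision.Likelihood(value).name)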

Feature analysis

Amazon

Person 99.3%

Captions

Microsoft

a group of people in a room 77.6%
a group of people standing in a room 69.9%
a group of people posing for a photo 50.8%
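
The ranked captions with confidence values resemble the description feature of Microsoft's Computer Vision service. A rough sketch against the REST Describe endpoint is below; the endpoint host, key, and API version (v3.2) are placeholders and assumptions, so verify them against the current Azure documentation.

import requests

def describe_image(image_path, endpoint, key):
    # Posts the raw image bytes and prints caption candidates with their
    # confidences (returned on a 0-1 scale) as percentages.
    with open(image_path, "rb") as f:
        response = requests.post(
            f"{endpoint}/vision/v3.2/describe",
            headers={"Ocp-Apim-Subscription-Key": key,
                     "Content-Type": "application/octet-stream"},
            params={"maxCandidates": 3},
            data=f.read(),
        )
    response.raise_for_status()
    for caption in response.json()["description"]["captions"]:
        print(f"{caption['text']} {caption['confidence'] * 100:.1f}%")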

Text analysis

Amazon

KODAK
2
FILM
3
JULiE
FILI
KODAK SAFETY
SAFETY
S'AFETY
Betty
DATE
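
The Amazon text results (film-edge markings such as KODAK and SAFETY, plus handwritten names) look like output from Rekognition's text detection. A brief sketch, again with an assumed local file:

import boto3

def detect_text(image_path):
    # Returns both LINE and WORD detections; only the detected strings are
    # kept here, mirroring the flat list of tokens above.
    client = boto3.client("rekognition")
    with open(image_path, "rb") as f:
        response = client.detect_text(Image={"Bytes": f.read()})
    return [d["DetectedText"] for d in response["TextDetections"]]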

Google

Juie
FILE
S'AFETY
Juie FILM ODAK S'AFETY FILE KODAK S'AFETY
FILM
KODAK
ODAK
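
The Google text results are consistent with the Cloud Vision API's text detection, which returns both the full recognized text and individual tokens. A minimal sketch, assuming the same client library and a local file:

from google.cloud import vision

def google_text(image_path):
    # text_annotations[0] holds the full recognized text; the remaining
    # entries are individual tokens like those listed above.
    client = vision.ImageAnnotatorClient()
    with open(image_path, "rb") as f:
        image = vision.Image(content=f.read())
    response = client.text_detection(image=image)
    return [t.description for t in response.text_annotations]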