Human Generated Data

Title

Untitled

Date

2005

People

Artist: David Levinthal, American, born 1949

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Anonymous gift, 2017.280

Machine Generated Data

Tags

Numbers are each service's confidence score for the tag, on a 0-100 scale.

Amazon
created on 2019-04-10

Human 97
Person 97
Person 96.8
Electronics 85.6
Screen 85.6
Display 85.6
Monitor 85.6
Art 76.9
Figurine 76.2
Apparel 71.6
Clothing 71.6
Painting 59.1
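
The tags above come from Amazon Rekognition's label detection. As a point of reference, the following is a minimal Python sketch of how such labels can be requested with boto3; the file name, region, and thresholds are illustrative assumptions, not details recorded on this page.

    # Sketch: image label detection with Amazon Rekognition (boto3).
    # "photo.jpg", the region, and both thresholds are assumptions.
    import boto3

    client = boto3.client("rekognition", region_name="us-east-1")

    with open("photo.jpg", "rb") as f:
        image_bytes = f.read()

    response = client.detect_labels(
        Image={"Bytes": image_bytes},
        MaxLabels=20,        # cap the number of labels returned
        MinConfidence=50.0,  # drop low-confidence guesses
    )

    # Print "Name Confidence" pairs, mirroring the list above (e.g. "Human 97.0").
    for label in response["Labels"]:
        print(f"{label['Name']} {label['Confidence']:.1f}")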

Clarifai
created on 2018-11-05

people 99.2
adult 97.8
wear 97.7
painting 96.7
exhibition 95.5
music 94.7
woman 94.3
man 94
one 93.5
picture frame 92.7
art 92.1
indoors 89.3
facial expression 89
screen 88.1
group 87.9
portrait 87.8
museum 87.1
child 85.6
furniture 85
movie 83.5

Imagga
created on 2018-11-05

person 28.9
black 28.6
adult 24
people 22.3
man 21.5
male 19.3
portrait 18.1
dark 17.5
one 17.2
model 16.3
attractive 15.4
sexy 15.3
lady 14.6
body 14.4
posing 14.2
fashion 13.6
human 13.5
dress 12.6
device 12.6
style 12.6
love 11.8
art 11.6
silhouette 11.6
passion 11.3
sensuality 10.9
elevator 10.8
looking 10.4
symbol 10.1
sensual 10
pretty 9.8
dance 9.7
women 9.5
culture 9.4
expression 9.4
figure 9.3
vintage 9.1
religion 9
clothing 8.8
lifting device 8.6
dancer 8.6
performer 8.6
elegant 8.6
youth 8.5
window 8.5
clothes 8.4
elegance 8.4
church 8.3
music 8.1
suit 8.1
lifestyle 7.9
couple 7.8
dancing 7.7
god 7.7
fine 7.6
wife 7.6
leisure 7.5
action 7.4
emotion 7.4
slim 7.4
light 7.3
alone 7.3
pose 7.2
hair 7.1

Google
created on 2018-11-05

art 87.5
picture frame 69.5
painting 59.5
visual arts 52.7
modern art 51.8

Microsoft
created on 2018-11-05

monitor 99.1
electronics 97.4
screen 86.9
television 85.2
display 82.9
picture frame 43.1
computer 41.1

Face analysis

AWS Rekognition

Age 10-15
Gender Female, 99.8%
Disgusted 0.7%
Calm 87.7%
Surprised 0.8%
Angry 1.2%
Happy 1.2%
Sad 7.1%
Confused 1.3%

AWS Rekognition

Age 26-43
Gender Male, 71.2%
Surprised 3.1%
Calm 51.2%
Angry 4.2%
Sad 34.4%
Happy 2.6%
Confused 2.9%
Disgusted 1.6%
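
The two blocks above are two faces returned by a single Amazon Rekognition face-analysis call. A minimal Python sketch of that call follows; the file name and region are assumptions.

    # Sketch: face analysis with Amazon Rekognition (boto3).
    # Attributes=["ALL"] requests age range, gender, and emotions, as listed above.
    import boto3

    client = boto3.client("rekognition", region_name="us-east-1")

    with open("photo.jpg", "rb") as f:  # illustrative file name
        image_bytes = f.read()

    response = client.detect_faces(Image={"Bytes": image_bytes}, Attributes=["ALL"])

    for face in response["FaceDetails"]:
        age = face["AgeRange"]
        gender = face["Gender"]
        print(f"Age {age['Low']}-{age['High']}")
        print(f"Gender {gender['Value']}, {gender['Confidence']:.1f}%")
        for emotion in face["Emotions"]:
            print(f"{emotion['Type'].capitalize()} {emotion['Confidence']:.1f}%")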

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely
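
Unlike Rekognition, Google Vision reports face attributes as likelihood buckets rather than percentages, which is why the values above read "Very unlikely". A sketch with the google-cloud-vision client library; the credentials setup and file name are assumptions.

    # Sketch: face detection with the Google Cloud Vision client library.
    # Likelihoods are enum buckets (VERY_UNLIKELY ... VERY_LIKELY), not percentages.
    from google.cloud import vision

    client = vision.ImageAnnotatorClient()  # assumes GOOGLE_APPLICATION_CREDENTIALS is set

    with open("photo.jpg", "rb") as f:  # illustrative file name
        image = vision.Image(content=f.read())

    response = client.face_detection(image=image)

    for face in response.face_annotations:
        print("Surprise", vision.Likelihood(face.surprise_likelihood).name)
        print("Anger", vision.Likelihood(face.anger_likelihood).name)
        print("Sorrow", vision.Likelihood(face.sorrow_likelihood).name)
        print("Joy", vision.Likelihood(face.joy_likelihood).name)
        print("Headwear", vision.Likelihood(face.headwear_likelihood).name)
        print("Blurred", vision.Likelihood(face.blurred_likelihood).name)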

Feature analysis

Amazon

Person 97%
Monitor 85.6%

Captions

Azure OpenAI

created on 2024-11-18

The image depicts a small-scale nativity scene with figures representing the birth of Jesus. There is a figure that appears to be Mary, dressed in robes and kneeling by a manger, where a small infant figure lies swaddled. Another robed figure, likely representing Joseph, is standing to the side with what appears to be a staff. The setting appears simplistic with only a few pieces of greenery to suggest a stable or manger setting. The colors of the figurines are muted, with dark browns and soft whites, creating a sense of antiquity. On the border of the image, there is a signature and a date indicating that this is likely a photograph of the scene. The signature reads "Dale C... 2007 1/1," suggesting that this is a limited edition print, being the first in a series or a unique print. The photograph itself has a classic aesthetic, with a warm orange border that suggests it could be a Polaroid or a similar style of instant film photograph, known for their distinctive border frames.
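
A caption like the one above can be produced by sending the photograph to a vision-capable chat model. A minimal sketch against the Azure OpenAI chat completions API; the endpoint, deployment name, API version, and prompt are assumptions, since the page records only the output.

    # Sketch: image captioning via Azure OpenAI chat completions.
    # Endpoint, deployment name, API version, and prompt are illustrative assumptions.
    import base64
    from openai import AzureOpenAI

    client = AzureOpenAI(
        azure_endpoint="https://example.openai.azure.com",  # placeholder endpoint
        api_key="...",                                      # placeholder key
        api_version="2024-06-01",
    )

    with open("photo.jpg", "rb") as f:  # illustrative file name
        b64 = base64.b64encode(f.read()).decode("utf-8")

    response = client.chat.completions.create(
        model="gpt-4o",  # deployment name; an assumption
        messages=[{
            "role": "user",
            "content": [
                {"type": "text", "text": "Describe this photograph."},
                {"type": "image_url",
                 "image_url": {"url": f"data:image/jpeg;base64,{b64}"}},
            ],
        }],
    )
    print(response.choices[0].message.content)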

Anthropic Claude

created on 2024-11-18

The image depicts a traditional nativity scene, featuring figurines representing the baby Jesus, Mary, and Joseph. The figures are positioned in a darkened environment, with only a soft, warm light illuminating the scene. The pose and positioning of the figures convey a sense of reverence and wonder surrounding the birth of the Christ child. The overall tone of the image is one of solemnity and contemplation.
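
The Anthropic caption can be generated the same way through the Messages API. A sketch with the anthropic Python client; the model name and prompt are assumptions, as the page does not name the model used.

    # Sketch: image description via the Anthropic Messages API.
    # Model name and prompt are illustrative assumptions.
    import base64
    import anthropic

    client = anthropic.Anthropic()  # assumes ANTHROPIC_API_KEY is set

    with open("photo.jpg", "rb") as f:  # illustrative file name
        b64 = base64.b64encode(f.read()).decode("utf-8")

    message = client.messages.create(
        model="claude-3-5-sonnet-20241022",  # an assumption; the page names no model
        max_tokens=300,
        messages=[{
            "role": "user",
            "content": [
                {"type": "image",
                 "source": {"type": "base64",
                            "media_type": "image/jpeg",
                            "data": b64}},
                {"type": "text", "text": "Describe this photograph."},
            ],
        }],
    )
    print(message.content[0].text)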

Text analysis

Amazon

1/
uteQ
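
The fragments above are raw strings returned by Amazon Rekognition's text detection, presumably picked up from the handwritten edition mark on the print's border. A minimal sketch of that call in Python; the file name and region are assumptions.

    # Sketch: text (OCR) detection with Amazon Rekognition (boto3).
    # File name and region are illustrative assumptions.
    import boto3

    client = boto3.client("rekognition", region_name="us-east-1")

    with open("photo.jpg", "rb") as f:
        image_bytes = f.read()

    response = client.detect_text(Image={"Bytes": image_bytes})

    # LINE entries aggregate WORD entries; both carry a confidence score.
    for detection in response["TextDetections"]:
        if detection["Type"] == "LINE":
            print(detection["DetectedText"], f"{detection['Confidence']:.1f}%")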