Human Generated Data

Title

Abu Simbel

Date

2005-2006

People

Artist: Ellen Gallagher, American, born 1965

Publisher: Two Palms Press, Inc.

Classification

Prints

Credit Line

Harvard Art Museums/Fogg Museum, Margaret Fisher Fund, 2006.82

Copyright

© Ellen Gallagher

Machine Generated Data

Tags

Amazon
created on 2019-04-06

Person 94.6
Human 94.6
Poster 93.5
Advertisement 93.5
Person 90.9
Art 82.2
Person 67.2

Clarifai
created on 2018-04-19

people 98.7
man 95.4
art 94.6
illustration 94.4
adult 91.4
group 91.2
vehicle 84.6
travel 80.7
old 78
print 77.5
religion 77
symbol 75.7
painting 75.7
woman 75.1
weapon 74.8
music 74.8
wear 74.5
transportation system 74.4
desktop 73.8
war 73.7

Imagga
created on 2018-04-19

blackboard 67.9
graffito 41.2
decoration 27.9
old 22.3
chalk 20.8
vintage 19.8
structure 19.4
grunge 18.7
texture 17.4
retro 16.4
board 15.2
business 15.2
black 15
money 14.5
ancient 13.8
education 13
antique 13
symbol 12.8
chalkboard 12.7
finance 12.7
design 12.6
frame 12.5
financial 12.5
pattern 12.3
letter 11.9
message 11.9
aged 11.8
paper 11.8
graphic 11.7
currency 11.7
material 11.6
learn 11.3
sign 11.3
note 11
cash 11
text 10.5
drawing 10.3
memorial 9.7
write 9.4
art 9.3
dollar 9.3
back 9.2
stamp 9.1
wealth 9
school 9
spot 8.6
exchange 8.6
space 8.5
plan 8.5
billboard 8.4
study 8.4
wall 8.2
border 8.1
lace 8.1
bank 8.1
idea 8
container 8
postmark 7.9
postage 7.9
written 7.9
classroom 7.8
teacher 7.7
wallpaper 7.7
mail 7.7
card 7.7
worn 7.6
damaged 7.6
canvas 7.6
clothing 7.6
learning 7.5
savings 7.5
banking 7.4
historic 7.3
backgrounds 7.3
success 7.2
dirty 7.2

Google
created on 2018-04-19

art 85.9
mural 65.2
visual arts 61.9

Microsoft
created on 2018-04-19

text 97.6
book 93.5
old 46.8

Face analysis

AWS Rekognition

Age 26-43
Gender Male, 75.2%
Calm 54.3%
Sad 24.8%
Disgusted 2.3%
Surprised 5.1%
Angry 5.2%
Happy 4.8%
Confused 3.4%

AWS Rekognition

Age 11-18
Gender Female, 50.5%
Angry 49.5%
Surprised 49.5%
Confused 49.5%
Disgusted 49.5%
Happy 50.4%
Sad 49.6%
Calm 49.5%

AWS Rekognition

Age 10-15
Gender Male, 51.5%
Surprised 45.4%
Calm 52.4%
Sad 45.8%
Disgusted 45.1%
Angry 45.3%
Happy 45.8%
Confused 45.2%

AWS Rekognition

Age 26-43
Gender Male, 54.1%
Disgusted 47.6%
Happy 45.4%
Confused 45.6%
Calm 48.2%
Sad 46.7%
Surprised 45.6%
Angry 46.1%

AWS Rekognition

Age 35-53
Gender Male, 54.4%
Sad 46%
Surprised 45.2%
Angry 53.2%
Happy 45.1%
Calm 45.1%
Confused 45.3%
Disgusted 45.1%

AWS Rekognition

Age 35-53
Gender Female, 50%
Angry 49.5%
Happy 49.5%
Disgusted 49.5%
Confused 49.5%
Surprised 49.5%
Sad 50.4%
Calm 49.5%

AWS Rekognition

Age 35-52
Gender Female, 80.4%
Disgusted 1.5%
Surprised 3%
Angry 4.3%
Happy 2.3%
Calm 11.5%
Sad 65.2%
Confused 12.1%

AWS Rekognition

Age 48-68
Gender Female, 73.3%
Angry 1.8%
Confused 2.2%
Calm 85.3%
Happy 1.7%
Disgusted 1.6%
Sad 2.9%
Surprised 4.4%

AWS Rekognition

Age 29-45
Gender Male, 50.2%
Angry 49.7%
Happy 49.5%
Surprised 49.5%
Sad 50%
Disgusted 49.6%
Confused 49.5%
Calm 49.6%

AWS Rekognition

Age 26-43
Gender Female, 51.4%
Happy 45.3%
Sad 45.2%
Disgusted 45.3%
Calm 50.5%
Angry 47.8%
Surprised 45.5%
Confused 45.3%

AWS Rekognition

Age 20-38
Gender Male, 52.6%
Confused 49.3%
Surprised 46.3%
Calm 46.2%
Sad 45.9%
Disgusted 45.6%
Angry 46.1%
Happy 45.5%

AWS Rekognition

Age 26-43
Gender Male, 50%
Disgusted 49.5%
Angry 49.6%
Happy 49.6%
Surprised 49.6%
Sad 49.5%
Calm 50.1%
Confused 49.6%

AWS Rekognition

Age 10-15
Gender Male, 54.7%
Surprised 45.9%
Angry 46.7%
Confused 48.4%
Disgusted 45.4%
Happy 45.6%
Sad 45.3%
Calm 47.5%

Microsoft Cognitive Services

Age 30
Gender Male

Microsoft Cognitive Services

Age 6
Gender Female

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person 94.6%
Poster 93.5%

Captions

Azure OpenAI

Created on 2024-02-06

This image contains a mix of surreal and classical elements, featuring a historic relief of seated figures that appear to be carved into a rocky cliff. A staircase with red glowing steps leads up to the figures. Scattered about are various abstract shapes and whimsical elements, like floating stars and a peculiar flying object in the upper left corner, with eyes and fluffy blue tufts. In the lower left corner, a jumble of black circular forms spills over a vintage television set. The overall scene is a blend of grayscale and vivid colors, creating a dreamlike atmosphere.

Anthropic Claude

Created on 2024-03-29

The image depicts a surreal and dreamlike scene. In the center, there is a large, colorful, and whimsical creature with a yellow disc-shaped body and striking blue and red feathers. This creature appears to be a fantastical or imaginary being. In the background, there is a black and white scene showing several human figures, some of which appear to be in a state of distress or hardship. The figures are depicted in a stark, shadowy style, creating a sense of mystery and unease. The overall composition blends the fantastical and the realistic, creating a striking contrast and a sense of the uncanny. The use of light, shadow, and color contributes to the surreal and dreamlike quality of the image.

Text analysis

Amazon

eteaote